Chapter 4 Determinants
This solutions guide illuminates Chapter 4: Determinants, a critical concept intrinsically linked to square matrices. While a matrix itself is an array of numbers, its determinant is a unique scalar value computed from its elements according to specific rules. Determinants encapsulate important algebraic and geometric properties of the corresponding matrix and the linear transformation it represents. They serve as powerful tools in linear algebra for solving systems of linear equations, finding the inverse of a matrix, calculating areas and volumes in coordinate geometry, and analyzing vector spaces. This chapter focuses on the methods for calculating determinants, understanding their fundamental properties, and applying them to solve various mathematical problems.
The solutions begin by defining the determinant for square matrices of different orders. For a $1 \times 1$ matrix $[a]$, the determinant is simply $a$. For a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$, the determinant is calculated as $\det(A) = ad - bc$. For matrices of order $3 \times 3$ and higher, the calculation typically involves a method called expansion by minors and cofactors along any chosen row or column. The solutions meticulously explain how to find the Minor ($M_{ij}$) of an element $a_{ij}$ (the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column) and the corresponding Cofactor ($C_{ij}$), which is the minor multiplied by an appropriate sign factor: $C_{ij} = (-1)^{i+j}M_{ij}$. The determinant is then the sum of the products of the elements of any row (or column) with their corresponding cofactors (e.g., $\det(A) = a_{11}C_{11} + a_{12}C_{12} + a_{13}C_{13}$ for expansion along the first row of a $3 \times 3$ matrix).
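As a supplementary computational check (not part of the NCERT solution), the following minimal Python sketch implements cofactor expansion along the first row for a $3 \times 3$ matrix and evaluates the matrix that appears in Example 3 below; the function names are chosen only for this illustration.

```python
def det2(m):
    """Determinant of a 2x2 matrix given as nested lists: ad - bc."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def det3_first_row(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    total = 0
    for j in range(3):
        # Minor M_1(j+1): delete the first row and the (j+1)-th column
        minor = [[m[i][k] for k in range(3) if k != j] for i in (1, 2)]
        cofactor = (-1) ** j * det2(minor)   # sign (-1)^(1 + (j+1)) = (-1)^j
        total += m[0][j] * cofactor
    return total

A = [[1, 2, 4], [-1, 3, 0], [4, 1, 0]]   # the matrix of Example 3 below
print(det3_first_row(A))                  # -52
```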
A significant emphasis is placed on understanding and utilizing the numerous properties of determinants, as these often drastically simplify calculations. Key properties demonstrated in the solutions include:
- The determinant remains unchanged if its rows and columns are interchanged ($\det(A) = \det(A^T)$).
- If any two rows (or columns) are interchanged, the sign of the determinant changes.
- If any two rows (or columns) are identical or proportional, the determinant is zero.
- If each element of a row (or column) is multiplied by a constant $k$, the determinant gets multiplied by $k$.
- The crucial property that adding a multiple of one row (or column) to another row (or column) (e.g., applying $R_i \rightarrow R_i + kR_j$ or $C_i \rightarrow C_i + kC_j$) does not change the value of the determinant. This property is extensively used to introduce zeros into a row or column, simplifying subsequent expansion.
- $\det(AB) = \det(A)\det(B)$ for square matrices A and B of the same order.
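These properties can be checked on concrete matrices. The short SymPy sketch below is an illustrative aid, not part of the textbook solutions; it uses the matrix from Example 6, and the matrix $B$ is an arbitrary choice for the product rule.

```python
import sympy as sp

A = sp.Matrix([[2, -3, 5], [6, 0, 4], [1, 5, -7]])   # the determinant from Example 6 (value -28)
B = sp.Matrix([[1, 2, 0], [0, 1, 3], [2, 1, 1]])     # an arbitrary second matrix for the product rule

A_swapped = sp.Matrix.vstack(A.row(1), A.row(0), A.row(2))                  # interchange R1 and R2
A_scaled  = sp.Matrix.vstack(4 * A.row(0), A.row(1), A.row(2))              # multiply R1 by 4
A_added   = sp.Matrix.vstack(A.row(0) + 5 * A.row(1), A.row(1), A.row(2))   # R1 -> R1 + 5 R2

print(A.det() == A.T.det())                 # True: det(A) = det(A^T)
print(A_swapped.det() == -A.det())          # True: interchanging two rows changes the sign
print(A_scaled.det() == 4 * A.det())        # True: scaling one row by k multiplies the determinant by k
print(A_added.det() == A.det())             # True: R1 -> R1 + 5 R2 leaves the determinant unchanged
print((A * B).det() == A.det() * B.det())   # True: det(AB) = det(A) det(B)
```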
The solutions showcase how these properties allow complex determinants to be evaluated, often without full expansion. Applications of determinants are then explored. One key geometric application is finding the Area of a Triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, $(x_3, y_3)$, given by the formula $\text{Area} = \frac{1}{2} |x_1(y_2-y_3) + x_2(y_3-y_1) + x_3(y_1-y_2)|$, which can be expressed using a determinant. This also provides a method to check for the collinearity of three points (the area will be zero). Another vital application involves finding the inverse of a square matrix. This first requires calculating the adjoint of the matrix, denoted $\text{adj } A$, which is the transpose of the matrix of cofactors. The solutions demonstrate finding the adjoint and verifying the fundamental relationship $A(\text{adj } A) = (\text{adj } A)A = (\det A)I$, where $I$ is the identity matrix. This relationship directly yields the formula for the inverse of $A$: $A^{-1} = \frac{1}{\det A} (\text{adj } A)$, which is valid only when $\det A \neq 0$. Matrices with $\det A = 0$ are called singular (non-invertible), while those with $\det A \neq 0$ are non-singular (invertible).
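As an illustrative check of the adjoint and inverse relationships described above, the following SymPy sketch uses the $2 \times 2$ matrix from Question 3; it is a sketch, not the textbook's working.

```python
import sympy as sp

A = sp.Matrix([[1, 2], [4, 2]])      # the matrix of Question 3; det(A) = -6, so A is non-singular

adjA = A.adjugate()                   # adj A: transpose of the cofactor matrix
detA = A.det()

print(A * adjA == detA * sp.eye(2))   # True: A (adj A) = (det A) I
print(A.inv() == adjA / detA)         # True: A^(-1) = (1/det A) (adj A)
```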
Finally, a major practical application is presented: solving systems of linear equations (e.g., $a_1x+b_1y+c_1z=d_1$, etc.) using the matrix method. The system is written in matrix form $AX = B$, where $A$ is the coefficient matrix, $X$ is the column matrix of variables, and $B$ is the column matrix of constants. If $A$ is non-singular, the unique solution is given by $X = A^{-1}B$. The solutions demonstrate setting up the matrices, finding $A^{-1}$ using the adjoint method, performing the matrix multiplication $A^{-1}B$ to find the values of the variables, and discuss checking the consistency of the system based on the values of $\det(A)$ and $(\text{adj } A)B$.
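A minimal sketch of the matrix method is given below for an assumed sample system; the coefficients are illustrative and do not come from a specific exercise in this chapter.

```python
import sympy as sp

# Assumed sample system (illustrative, not taken from a specific exercise):
#   x + y + z = 6,   x - y + z = 2,   2x + y - z = 1
A = sp.Matrix([[1, 1, 1], [1, -1, 1], [2, 1, -1]])
B = sp.Matrix([6, 2, 1])

print(A.det())      # 6, non-zero, so A is non-singular and the solution is unique
X = A.inv() * B     # the matrix method: X = A^(-1) B
print(X)            # Matrix([[1], [2], [3]])  =>  x = 1, y = 2, z = 3
```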
Examples 1 to 5 (Before Exercise 4.1)
Example 1: Evaluate $\begin{vmatrix} 2&4\\−1&2 \end{vmatrix}$ .
Answer:
Given determinant is:
$\begin{vmatrix} 2&4\\−1&2 \end{vmatrix}$
To evaluate a 2x2 determinant $\begin{vmatrix} a&b\\c&d \end{vmatrix}$, we use the formula: $ad - bc$.
In this case, $a=2$, $b=4$, $c=-1$, and $d=2$.
So, the determinant is calculated as:
$(2)(2) - (4)(-1)$
$= 4 - (-4)$
$= 4 + 4$
$= 8$
The value of the determinant is:
$\begin{vmatrix} 2&4\\−1&2 \end{vmatrix} = 8$
Example 2: Evaluate $\begin{vmatrix} x&x+1\\x−1&x \end{vmatrix}$ .
Answer:
Given determinant is:
$\begin{vmatrix} x&x+1\\x−1&x \end{vmatrix}$
To evaluate a 2x2 determinant $\begin{vmatrix} a&b\\c&d \end{vmatrix}$, we use the formula: $ad - bc$.
In this case, $a=x$, $b=x+1$, $c=x-1$, and $d=x$.
So, the determinant is calculated as:
$(x)(x) - (x+1)(x-1)$
Using the algebraic identity $(a+b)(a-b) = a^2 - b^2$:
$= x^2 - ((x)^2 - (1)^2)$
$= x^2 - (x^2 - 1)$
$= x^2 - x^2 + 1$
$= 1$
The value of the determinant is:
$\begin{vmatrix} x&x+1\\x−1&x \end{vmatrix} = 1$
Example 3: Evaluate the determinant ∆ = $\begin{vmatrix} 1&2&4\\−1&3&0\\4&1&0 \end{vmatrix}$ .
Answer:
Given determinant ∆ is:
∆ = $\begin{vmatrix} 1&2&4\\−1&3&0\\4&1&0 \end{vmatrix}$
To evaluate a 3x3 determinant, we can expand it along any row or column. Expanding along a column with zeros will simplify the calculation. Let's expand along the third column (C3).
The formula for expansion along the j-th column is:
∆ = $\sum_{i=1}^{3} (-1)^{i+j} a_{ij} M_{ij}$
Expanding along C3 (where $j=3$):
∆ = $(-1)^{1+3} a_{13} M_{13} + (-1)^{2+3} a_{23} M_{23} + (-1)^{3+3} a_{33} M_{33}$
Here, $a_{13} = 4$, $a_{23} = 0$, $a_{33} = 0$.
$M_{13}$ is the minor obtained by deleting the 1st row and 3rd column:
$M_{13} = \begin{vmatrix} −1&3\\4&1 \end{vmatrix} = (-1)(1) - (3)(4) = -1 - 12 = -13$
Since $a_{23}=0$ and $a_{33}=0$, the terms $(-1)^{2+3} a_{23} M_{23}$ and $(-1)^{3+3} a_{33} M_{33}$ will be zero.
So, ∆ = $(+1) (4) M_{13} + (-1) (0) M_{23} + (+1) (0) M_{33}$
∆ = $4 \times (-13) + 0 + 0$
∆ = $-52$
The value of the determinant is:
∆ = $-52$
Example 4: Evaluate ∆ = $\begin{vmatrix} 0& \sinα & -\cosα \\ −\sinα&0&\sinβ \\ \cosα&-\sinβ&0 \end{vmatrix}$ .
Answer:
Given determinant ∆ is:
∆ = $\begin{vmatrix} 0& \sinα & -\cosα \\ −\sinα&0&\sinβ \\ \cosα&-\sinβ&0 \end{vmatrix}$
To evaluate a 3x3 determinant, we can expand it along any row or column. Let's expand along the first row (R1).
The formula for expansion along the first row is:
∆ = $a_{11}C_{11} + a_{12}C_{12} + a_{13}C_{13}$
Where $C_{ij} = (-1)^{i+j} M_{ij}$ is the cofactor and $M_{ij}$ is the minor.
Here, $a_{11} = 0$, $a_{12} = \sinα$, $a_{13} = -\cosα$.
$C_{11} = (-1)^{1+1} M_{11} = (+1) \begin{vmatrix} 0&\sinβ \\ -\sinβ&0 \end{vmatrix} = (0)(0) - (\sinβ)(-\sinβ) = 0 - (-\sin^2β) = \sin^2β$
$C_{12} = (-1)^{1+2} M_{12} = (-1) \begin{vmatrix} −\sinα&\sinβ \\ \cosα&0 \end{vmatrix} = (-1) ((-\sinα)(0) - (\sinβ)(\cosα)) = (-1) (0 - \sinβ\cosα) = (-1) (-\sinβ\cosα) = \sinβ\cosα$
$C_{13} = (-1)^{1+3} M_{13} = (+1) \begin{vmatrix} −\sinα&0 \\ \cosα&-\sinβ \end{vmatrix} = (+1) ((-\sinα)(-\sinβ) - (0)(\cosα)) = (+1) (\sinα\sinβ - 0) = \sinα\sinβ$
Now substitute these values into the expansion formula:
∆ = $(0) (\sin^2β) + (\sinα) (\sinβ\cosα) + (-\cosα) (\sinα\sinβ)$
∆ = $0 + \sinα\sinβ\cosα - \cosα\sinα\sinβ$
∆ = $\sinα\sinβ\cosα - \sinα\sinβ\cosα$
∆ = $0$
The value of the determinant is:
∆ = $0$
Example 5: Find values of x for which $\begin{vmatrix} 3&x\\x&1 \end{vmatrix}$ = $\begin{vmatrix} 3&2\\4&1 \end{vmatrix}$ .
Answer:
Given that the two determinants are equal:
$\begin{vmatrix} 3&x\\x&1 \end{vmatrix}$ = $\begin{vmatrix} 3&2\\4&1 \end{vmatrix}$
First, evaluate the left determinant $\begin{vmatrix} 3&x\\x&1 \end{vmatrix}$ using the formula $ad-bc$:
$\begin{vmatrix} 3&x\\x&1 \end{vmatrix} = (3)(1) - (x)(x) = 3 - x^2$
Next, evaluate the right determinant $\begin{vmatrix} 3&2\\4&1 \end{vmatrix}$ using the formula $ad-bc$:
$\begin{vmatrix} 3&2\\4&1 \end{vmatrix} = (3)(1) - (2)(4) = 3 - 8 = -5$
Now, set the two determinant values equal to each other as given in the problem:
$3 - x^2 = -5$
Rearrange the equation to solve for $x$:
$x^2 = 3 - (-5)$
$x^2 = 3 + 5$
$x^2 = 8$
Take the square root of both sides:
$x = \pm \sqrt{8}$
Simplify the square root:
$x = \pm \sqrt{4 \times 2}$
$x = \pm 2\sqrt{2}$
The values of x for which the equality holds are:
$x = 2\sqrt{2}$ or $x = -2\sqrt{2}$
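As a supplementary check (illustrative only), the same equation can be set up and solved symbolically with SymPy:

```python
import sympy as sp

x = sp.Symbol('x')

lhs = sp.Matrix([[3, x], [x, 1]]).det()    # 3 - x**2
rhs = sp.Matrix([[3, 2], [4, 1]]).det()    # -5

print(sp.solve(sp.Eq(lhs, rhs), x))        # the two roots x = -2*sqrt(2) and x = 2*sqrt(2)
```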
Exercise 4.1
Evaluate the determinants in Exercises 1 and 2.
Question 1. $\begin{vmatrix} 2&4\\−5&−1 \end{vmatrix}$
Answer:
Given determinant is:
$\begin{vmatrix} 2&4\\−5&−1 \end{vmatrix}$
To evaluate a 2x2 determinant $\begin{vmatrix} a&b\\c&d \end{vmatrix}$, we use the formula: $ad - bc$.
In this case, $a=2$, $b=4$, $c=-5$, and $d=-1$.
So, the determinant is calculated as:
$(2)(-1) - (4)(-5)$
$= -2 - (-20)$
$= -2 + 20$
$= 18$
The value of the determinant is:
$\begin{vmatrix} 2&4\\−5&−1 \end{vmatrix} = 18$
Question 2.
(i) $\begin{vmatrix} \cosθ&−\sinθ\\\sinθ&\cosθ \end{vmatrix}$
(ii) $\begin{vmatrix} x^2−x+1&x−1\\x+1&x+1 \end{vmatrix}$
Answer:
(i) Given determinant is:
$\begin{vmatrix} \cosθ&−\sinθ\\\sinθ&\cosθ \end{vmatrix}$
To evaluate a 2x2 determinant $\begin{vmatrix} a&b\\c&d \end{vmatrix}$, we use the formula: $ad - bc$.
In this case, $a=\cosθ$, $b=−\sinθ$, $c=\sinθ$, and $d=\cosθ$.
So, the determinant is calculated as:
$(\cosθ)(\cosθ) - (-\sinθ)(\sinθ)$
$= \cos^2θ - (-\sin^2θ)$
$= \cos^2θ + \sin^2θ$
Using the trigonometric identity $\sin^2θ + \cos^2θ = 1$:
$= 1$
The value of the determinant is:
$\begin{vmatrix} \cosθ&−\sinθ\\\sinθ&\cosθ \end{vmatrix} = 1$
(ii) Given determinant is:
$\begin{vmatrix} x^2−x+1&x−1\\x+1&x+1 \end{vmatrix}$
To evaluate a 2x2 determinant $\begin{vmatrix} a&b\\c&d \end{vmatrix}$, we use the formula: $ad - bc$.
In this case, $a=x^2−x+1$, $b=x−1$, $c=x+1$, and $d=x+1$.
So, the determinant is calculated as:
$(x^2−x+1)(x+1) - (x−1)(x+1)$
Use the sum of cubes identity $(a^2-ab+b^2)(a+b) = a^3+b^3$ for the first term, with $a=x$ and $b=1$. Or simply expand it.
$(x^2(x+1) - x(x+1) + 1(x+1)) - (x^2 - 1^2)$
$(x^3 + x^2 - x^2 - x + x + 1) - (x^2 - 1)$
$(x^3 + 1) - (x^2 - 1)$
$= x^3 + 1 - x^2 + 1$
$= x^3 - x^2 + 2$
The value of the determinant is:
$\begin{vmatrix} x^2−x+1&x−1\\x+1&x+1 \end{vmatrix} = x^3 - x^2 + 2$
Question 3. If A = $\begin{bmatrix} 1&2\\4&2 \end{bmatrix}$ , then show that | 2A | = 4 | A |
Answer:
Given:
The matrix A is given as:
$A = \begin{bmatrix} 1 & 2 \\ 4 & 2 \end{bmatrix}$
To Show:
We need to show that $|2A| = 4|A|$.
Solution:
First, we will calculate the determinant of the given matrix A.
$|A| = \det(A) = \det\begin{bmatrix} 1 & 2 \\ 4 & 2 \end{bmatrix}$
The determinant of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by $ad - bc$.
Using this formula for matrix A:
$|A| = (1 \times 2) - (2 \times 4)$
$|A| = 2 - 8$
$|A| = -6$
Next, we will find the matrix 2A by multiplying each element of matrix A by the scalar 2.
$2A = 2 \times \begin{bmatrix} 1 & 2 \\ 4 & 2 \end{bmatrix}$
$2A = \begin{bmatrix} 2 \times 1 & 2 \times 2 \\ 2 \times 4 & 2 \times 2 \end{bmatrix}$
$2A = \begin{bmatrix} 2 & 4 \\ 8 & 4 \end{bmatrix}$
Now, we will calculate the determinant of the matrix 2A.
$|2A| = \det(2A) = \det\begin{bmatrix} 2 & 4 \\ 8 & 4 \end{bmatrix}$
Using the determinant formula for a $2 \times 2$ matrix:
$|2A| = (2 \times 4) - (4 \times 8)$
$|2A| = 8 - 32$
$|2A| = -24$
Finally, we will calculate $4|A|$ and compare it with $|2A|$.
We found earlier that $|A| = -6$.
$4|A| = 4 \times (-6)$
$4|A| = -24$
Comparing the values of $|2A|$ and $4|A|$:
We have $|2A| = -24$ and $4|A| = -24$.
Since $|2A| = -24$ and $4|A| = -24$, it follows that $|2A| = 4|A|$.
Thus, it is shown that $|2A| = 4|A|$ for the given matrix A.
Question 4. If A = $\begin{bmatrix} 1&0&1\\0&1&2\\0&0&4 \end{bmatrix}$ , then show that | 3 A | = 27 | A |
Answer:
Given:
The matrix A is given as:
$A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 4 \end{bmatrix}$
To Show:
We need to show that $|3A| = 27|A|$.
Solution:
First, we will calculate the determinant of the given matrix A.
$|A| = \det(A) = \det\begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 4 \end{bmatrix}$
Since A is an upper triangular matrix, its determinant is the product of its diagonal elements.
$|A| = 1 \times 1 \times 4$
$|A| = 4$
Next, we will find the matrix 3A by multiplying each element of matrix A by the scalar 3.
$3A = 3 \times \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 4 \end{bmatrix}$
$3A = \begin{bmatrix} 3 \times 1 & 3 \times 0 & 3 \times 1 \\ 3 \times 0 & 3 \times 1 & 3 \times 2 \\ 3 \times 0 & 3 \times 0 & 3 \times 4 \end{bmatrix}$
$3A = \begin{bmatrix} 3 & 0 & 3 \\ 0 & 3 & 6 \\ 0 & 0 & 12 \end{bmatrix}$
Now, we will calculate the determinant of the matrix 3A.
$|3A| = \det(3A) = \det\begin{bmatrix} 3 & 0 & 3 \\ 0 & 3 & 6 \\ 0 & 0 & 12 \end{bmatrix}$
Since 3A is also an upper triangular matrix, its determinant is the product of its diagonal elements.
$|3A| = 3 \times 3 \times 12$
$|3A| = 9 \times 12$
$|3A| = 108$
Finally, we will calculate $27|A|$ and compare it with $|3A|$.
We found earlier that $|A| = 4$.
$27|A| = 27 \times 4$
$27|A| = 108$
Comparing the values of $|3A|$ and $27|A|$:
We have $|3A| = 108$ and $27|A| = 108$.
Since $|3A| = 108$ and $27|A| = 108$, it follows that $|3A| = 27|A|$.
Thus, it is shown that $|3A| = 27|A|$ for the given matrix A.
This also demonstrates the general property that for an $n \times n$ matrix A and a scalar $k$, $|kA| = k^n |A|$. In this case, $n=3$ and $k=3$, so $|3A| = 3^3 |A| = 27|A|$.
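This scalar-multiple property can be spot-checked computationally; the short SymPy sketch below reuses the matrix from Question 4 and is provided only as an illustration.

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1], [0, 1, 2], [0, 0, 4]])   # the matrix of Question 4
k, n = 3, A.shape[0]                                # scalar k = 3, order n = 3

print((k * A).det() == k**n * A.det())              # True: |kA| = k^n |A|, here 108 = 27 * 4
```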
Question 5. Evaluate the determinants
(i) $\begin{vmatrix} 3&−1&-2\\0&0&−1\\3&−5&0 \end{vmatrix}$
(ii) $\begin{vmatrix} 3&−4&5\\1&1&−2\\2&3&1 \end{vmatrix}$
(iii) $\begin{vmatrix} 0&1&2\\-1&0&−3\\-2&3&0 \end{vmatrix}$
(iv) $\begin{vmatrix} 2&−1&-2\\0&2&−1\\3&−5&0 \end{vmatrix}$
Answer:
(i) Evaluate $\begin{vmatrix} 3&−1&-2\\0&0&−1\\3&−5&0 \end{vmatrix}$
Solution:
We can evaluate the determinant by expanding along the second row (R2), as it contains two zeros. The signs for the terms in the cofactor expansion along R2 are $-, +, -$.
$\begin{vmatrix} 3 & −1 & -2 \\ 0 & 0 & −1 \\ 3 & −5 & 0 \end{vmatrix} = 0 \cdot C_{21} + 0 \cdot C_{22} + (-1) \cdot C_{23}$
Here, $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor obtained by deleting the i-th row and j-th column.
We only need to calculate $C_{23}$. The element is $-1$ at position (2, 3).
$C_{23} = (-1)^{2+3} M_{23} = (-1)^5 M_{23} = -M_{23}$
$M_{23}$ is the determinant of the matrix obtained by removing row 2 and column 3:
$M_{23} = \det\begin{bmatrix} 3 & -1 \\ 3 & -5 \end{bmatrix} = (3 \times -5) - (-1 \times 3) = -15 - (-3) = -15 + 3 = -12$
So, $C_{23} = -(-12) = 12$.
Now, substitute this back into the expansion:
$\begin{vmatrix} 3 & −1 & -2 \\ 0 & 0 & −1 \\ 3 & −5 & 0 \end{vmatrix} = 0 + 0 + (-1) \cdot 12 = -12$
The value of the determinant is $-12$.
(ii) Evaluate $\begin{vmatrix} 3&−4&5\\1&1&−2\\2&3&1 \end{vmatrix}$
Solution:
We can evaluate the determinant by expanding along the first row (R1). The signs for the terms are $+, -, +$.
$\begin{vmatrix} 3 & −4 & 5 \\ 1 & 1 & −2 \\ 2 & 3 & 1 \end{vmatrix} = 3 \cdot \det\begin{bmatrix} 1 & -2 \\ 3 & 1 \end{bmatrix} - (-4) \cdot \det\begin{bmatrix} 1 & -2 \\ 2 & 1 \end{bmatrix} + 5 \cdot \det\begin{bmatrix} 1 & 1 \\ 2 & 3 \end{bmatrix}$
Calculate the $2 \times 2$ determinants:
$\det\begin{bmatrix} 1 & -2 \\ 3 & 1 \end{bmatrix} = (1 \times 1) - (-2 \times 3) = 1 - (-6) = 1 + 6 = 7$
$\det\begin{bmatrix} 1 & -2 \\ 2 & 1 \end{bmatrix} = (1 \times 1) - (-2 \times 2) = 1 - (-4) = 1 + 4 = 5$
$\det\begin{bmatrix} 1 & 1 \\ 2 & 3 \end{bmatrix} = (1 \times 3) - (1 \times 2) = 3 - 2 = 1$
Substitute these values back into the expansion:
$\begin{vmatrix} 3 & −4 & 5 \\ 1 & 1 & −2 \\ 2 & 3 & 1 \end{vmatrix} = 3 \cdot (7) - (-4) \cdot (5) + 5 \cdot (1)$
$= 21 + 4 \cdot 5 + 5$
$= 21 + 20 + 5$
$= 46$
The value of the determinant is $46$.
(iii) Evaluate $\begin{vmatrix} 0&1&2\\-1&0&−3\\-2&3&0 \end{vmatrix}$
Solution:
We can evaluate the determinant by expanding along the first row (R1). The signs for the terms are $+, -, +$.
$\begin{vmatrix} 0 & 1 & 2 \\ -1 & 0 & −3 \\ -2 & 3 & 0 \end{vmatrix} = 0 \cdot \det\begin{bmatrix} 0 & -3 \\ 3 & 0 \end{bmatrix} - 1 \cdot \det\begin{bmatrix} -1 & -3 \\ -2 & 0 \end{bmatrix} + 2 \cdot \det\begin{bmatrix} -1 & 0 \\ -2 & 3 \end{bmatrix}$
Calculate the $2 \times 2$ determinants:
$\det\begin{bmatrix} 0 & -3 \\ 3 & 0 \end{bmatrix} = (0 \times 0) - (-3 \times 3) = 0 - (-9) = 9$
$\det\begin{bmatrix} -1 & -3 \\ -2 & 0 \end{bmatrix} = (-1 \times 0) - (-3 \times -2) = 0 - 6 = -6$
$\det\begin{bmatrix} -1 & 0 \\ -2 & 3 \end{bmatrix} = (-1 \times 3) - (0 \times -2) = -3 - 0 = -3$
Substitute these values back into the expansion:
$\begin{vmatrix} 0 & 1 & 2 \\ -1 & 0 & −3 \\ -2 & 3 & 0 \end{vmatrix} = 0 \cdot (9) - 1 \cdot (-6) + 2 \cdot (-3)$
$= 0 + 6 - 6$
$= 0$
The value of the determinant is $0$.
(iv) Evaluate $\begin{vmatrix} 2&−1&-2\\0&2&−1\\3&−5&0 \end{vmatrix}$
Solution:
We can evaluate the determinant by expanding along the first column (C1). The signs for the terms are $+, -, +$.
$\begin{vmatrix} 2 & −1 & -2 \\ 0 & 2 & −1 \\ 3 & −5 & 0 \end{vmatrix} = 2 \cdot \det\begin{bmatrix} 2 & -1 \\ -5 & 0 \end{bmatrix} - 0 \cdot \det\begin{bmatrix} -1 & -2 \\ -5 & 0 \end{bmatrix} + 3 \cdot \det\begin{bmatrix} -1 & -2 \\ 2 & -1 \end{bmatrix}$
Note that the second term is zero because the element is zero.
Calculate the $2 \times 2$ determinants:
$\det\begin{bmatrix} 2 & -1 \\ -5 & 0 \end{bmatrix} = (2 \times 0) - (-1 \times -5) = 0 - 5 = -5$
$\det\begin{bmatrix} -1 & -2 \\ 2 & -1 \end{bmatrix} = (-1 \times -1) - (-2 \times 2) = 1 - (-4) = 1 + 4 = 5$
Substitute these values back into the expansion:
$\begin{vmatrix} 2 & −1 & -2 \\ 0 & 2 & −1 \\ 3 & −5 & 0 \end{vmatrix} = 2 \cdot (-5) - 0 + 3 \cdot (5)$
$= -10 + 15$
$= 5$
The value of the determinant is $5$.
Question 6. If A = $\begin{bmatrix} 1&1&−2\\2&1&−3\\5&4&−9 \end{bmatrix}$ , find |A|
Answer:
Given:
The matrix A is given as:
$A = \begin{bmatrix} 1 & 1 & -2 \\ 2 & 1 & -3 \\ 5 & 4 & -9 \end{bmatrix}$
To Find:
We need to find the determinant of matrix A, denoted as $|A|$.
Solution:
We can evaluate the determinant of the $3 \times 3$ matrix by expanding along the first row (R1). The formula for the determinant expanded along the first row is:
$|A| = a_{11} \cdot C_{11} + a_{12} \cdot C_{12} + a_{13} \cdot C_{13}$
where $a_{ij}$ are the elements of the matrix and $C_{ij}$ are the corresponding cofactors. The cofactors are given by $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor (determinant of the submatrix obtained by deleting the i-th row and j-th column).
The elements of the first row are $a_{11}=1$, $a_{12}=1$, and $a_{13}=-2$.
Now, we calculate the minors and cofactors:
For $a_{11}=1$:
$M_{11} = \det\begin{bmatrix} 1 & -3 \\ 4 & -9 \end{bmatrix} = (1 \times -9) - (-3 \times 4) = -9 - (-12) = -9 + 12 = 3$
$C_{11} = (-1)^{1+1} M_{11} = (+1) \times 3 = 3$
For $a_{12}=1$:
$M_{12} = \det\begin{bmatrix} 2 & -3 \\ 5 & -9 \end{bmatrix} = (2 \times -9) - (-3 \times 5) = -18 - (-15) = -18 + 15 = -3$
$C_{12} = (-1)^{1+2} M_{12} = (-1) \times (-3) = 3$
For $a_{13}=-2$:
$M_{13} = \det\begin{bmatrix} 2 & 1 \\ 5 & 4 \end{bmatrix} = (2 \times 4) - (1 \times 5) = 8 - 5 = 3$
$C_{13} = (-1)^{1+3} M_{13} = (+1) \times 3 = 3$
Now, substitute these values into the determinant formula:
$|A| = a_{11} \cdot C_{11} + a_{12} \cdot C_{12} + a_{13} \cdot C_{13}$
$|A| = 1 \cdot (3) + 1 \cdot (3) + (-2) \cdot (3)$
$|A| = 3 + 3 - 6$
$|A| = 6 - 6$
$|A| = 0$
The value of the determinant of matrix A is $0$.
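As a supplementary check (not part of the solution above), SymPy confirms the value; the observation that $R_3 = 3R_1 + R_2$ is an extra remark verified in the code, not a statement from the text.

```python
import sympy as sp

A = sp.Matrix([[1, 1, -2], [2, 1, -3], [5, 4, -9]])

print(A.det())                               # 0
# Extra observation (verified here, not stated in the solution above):
# R3 = 3*R1 + R2, so R3 -> R3 - 3*R1 - R2 would produce a row of zeros.
print(A.row(2) == 3 * A.row(0) + A.row(1))   # True
```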
Question 7. Find values of x, if
(i) $\begin{vmatrix} 2&4\\5&1 \end{vmatrix}$ = $\begin{vmatrix} 2x&4\\6&x \end{vmatrix}$
(ii) $\begin{vmatrix} 2&3\\4&5 \end{vmatrix}$ = $\begin{vmatrix} x&3\\2x&5 \end{vmatrix}$
Answer:
(i) Find x if $\begin{vmatrix} 2&4\\5&1 \end{vmatrix}$ = $\begin{vmatrix} 2x&4\\6&x \end{vmatrix}$
Solution:
We need to evaluate the determinants on both sides of the equation and then solve for $x$.
The determinant of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by $ad - bc$.
Left side determinant:
$\begin{vmatrix} 2&4\\5&1 \end{vmatrix} = (2 \times 1) - (4 \times 5)$
$= 2 - 20$
$= -18$
Right side determinant:
$\begin{vmatrix} 2x&4\\6&x \end{vmatrix} = (2x \times x) - (4 \times 6)$
$= 2x^2 - 24$
Equating the two determinants:
$-18 = 2x^2 - 24$
Now, solve the equation for $x$:
$2x^2 - 24 = -18$
$2x^2 = -18 + 24$
$2x^2 = 6$
$x^2 = \frac{6}{2}$
$x^2 = 3$
Taking the square root of both sides:
$x = \pm\sqrt{3}$
The values of $x$ are $\sqrt{3}$ and $-\sqrt{3}$.
(ii) Find x if $\begin{vmatrix} 2&3\\4&5 \end{vmatrix}$ = $\begin{vmatrix} x&3\\2x&5 \end{vmatrix}$
Solution:
We need to evaluate the determinants on both sides of the equation and then solve for $x$.
The determinant of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by $ad - bc$.
Left side determinant:
$\begin{vmatrix} 2&3\\4&5 \end{vmatrix} = (2 \times 5) - (3 \times 4)$
$= 10 - 12$
$= -2$
Right side determinant:
$\begin{vmatrix} x&3\\2x&5 \end{vmatrix} = (x \times 5) - (3 \times 2x)$
$= 5x - 6x$
$= -x$
Equating the two determinants:
$-2 = -x$
Multiplying both sides by -1:
$x = 2$
The value of $x$ is $2$.
Question 8. If $\begin{vmatrix} x&2\\18&x \end{vmatrix}$ = $\begin{vmatrix} 6&2\\18&6 \end{vmatrix}$ , then x is equal to
(A) 6
(B) ± 6
(C) – 6
(D) 0
Answer:
Given:
The equation involving determinants is:
$\begin{vmatrix} x&2\\18&x \end{vmatrix} = \begin{vmatrix} 6&2\\18&6 \end{vmatrix}$
To Find:
We need to find the value(s) of $x$ that satisfy the given equation.
Solution:
We will evaluate the determinant on both sides of the equation.
The determinant of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by $ad - bc$.
Evaluate the determinant on the left side:
$\begin{vmatrix} x&2\\18&x \end{vmatrix} = (x \times x) - (2 \times 18)$
$= x^2 - 36$
Evaluate the determinant on the right side:
$\begin{vmatrix} 6&2\\18&6 \end{vmatrix} = (6 \times 6) - (2 \times 18)$
$= 36 - 36$
$= 0$
Now, equate the two determinant values as given in the problem:
$x^2 - 36 = 0$
Solve the resulting equation for $x$:
$x^2 = 36$
Taking the square root of both sides:
$x = \pm\sqrt{36}$
$x = \pm 6$
The values of $x$ that satisfy the equation are $x = 6$ and $x = -6$.
Comparing this result with the given options, the correct option is (B).
The final answer is $\pm 6$.
Examples 6 to 16 (Before Exercise 4.2)
Example 6: Verify Property 1 for ∆ = $\begin{vmatrix} 2&−3&5\\6&0&4\\1&5&−7 \end{vmatrix}$
Answer:
Here is the verification of Property 1 for the given determinant.
Given:
The determinant $\Delta$ is given by:
$\Delta = \begin{vmatrix} 2& -3& 5 \\ 6& 0& 4 \\ 1& 5& -7 \end{vmatrix}$
To Verify:
Property 1 of determinants.
Property 1:
The value of a determinant remains unchanged if its rows and columns are interchanged (i.e., if the transpose of the matrix is taken).
Symbolically, if A is a square matrix, then $\det(A) = \det(A')$, where A' is the transpose of A.
Verification:
First, we calculate the value of the given determinant $\Delta$. We can expand along the first row:
$\det(\Delta) = 2 \begin{vmatrix} 0 & 4 \\ 5 & -7 \end{vmatrix} - (-3) \begin{vmatrix} 6 & 4 \\ 1 & -7 \end{vmatrix} + 5 \begin{vmatrix} 6 & 0 \\ 1 & 5 \end{vmatrix}$
$\det(\Delta) = 2 ((0)(-7) - (4)(5)) + 3 ((6)(-7) - (4)(1)) + 5 ((6)(5) - (0)(1))$
$\det(\Delta) = 2 (0 - 20) + 3 (-42 - 4) + 5 (30 - 0)$
$\det(\Delta) = 2 (-20) + 3 (-46) + 5 (30)$
$\det(\Delta) = -40 - 138 + 150$
$\det(\Delta) = -178 + 150$
$\det(\Delta) = -28$
Next, we find the transpose of the given matrix. The transpose $\Delta'$ is obtained by interchanging the rows and columns of $\Delta$.
$\Delta' = \begin{vmatrix} 2& 6& 1 \\ -3& 0& 5 \\ 5& 4& -7 \end{vmatrix}$
Now, we calculate the value of the determinant of the transpose, $\det(\Delta')$. We can expand along the second row to simplify calculation due to the presence of zero:
$\det(\Delta') = -(-3) \begin{vmatrix} 6 & 1 \\ 4 & -7 \end{vmatrix} + 0 \begin{vmatrix} 2 & 1 \\ 5 & -7 \end{vmatrix} - 5 \begin{vmatrix} 2 & 6 \\ 5 & 4 \end{vmatrix}$
$\det(\Delta') = 3 ((6)(-7) - (1)(4)) + 0 - 5 ((2)(4) - (6)(5))$
$\det(\Delta') = 3 (-42 - 4) - 5 (8 - 30)$
$\det(\Delta') = 3 (-46) - 5 (-22)$
$\det(\Delta') = -138 - (-110)$
$\det(\Delta') = -138 + 110$
$\det(\Delta') = -28$
Conclusion:
We found that the value of the original determinant is $\det(\Delta) = -28$, and the value of the determinant of its transpose is $\det(\Delta') = -28$.
Since $\det(\Delta) = \det(\Delta')$, Property 1 is verified for the given determinant.
Example 7: Verify Property 2 for ∆ = $\begin{vmatrix} 2&−3&5\\6&0&4\\1&5&−7 \end{vmatrix}$ .
Answer:
Here is the verification of Property 2 for the given determinant.
Given:
The determinant $\Delta$ is given by:
$\Delta = \begin{vmatrix} 2& -3& 5 \\ 6& 0& 4 \\ 1& 5& -7 \end{vmatrix}$
To Verify:
Property 2 of determinants.
Property 2:
If any two rows (or columns) of a determinant are interchanged, then the sign of the determinant changes.
Verification:
First, we calculate the value of the original determinant $\Delta$. As calculated in Example 6, the value is:
$\det(\Delta) = -28$
Now, let's interchange R1 and R2 to create a new determinant, say $\Delta_1$.
$\Delta_1 = \begin{vmatrix} 6& 0& 4 \\ 2& -3& 5 \\ 1& 5& -7 \end{vmatrix}$
Next, we calculate the value of the new determinant $\det(\Delta_1)$. We can expand along the first row:
$\det(\Delta_1) = 6 \begin{vmatrix} -3 & 5 \\ 5 & -7 \end{vmatrix} - 0 \begin{vmatrix} 2 & 5 \\ 1 & -7 \end{vmatrix} + 4 \begin{vmatrix} 2 & -3 \\ 1 & 5 \end{vmatrix}$
$\det(\Delta_1) = 6 ((-3)(-7) - (5)(5)) - 0 + 4 ((2)(5) - (-3)(1))$
$\det(\Delta_1) = 6 (21 - 25) + 4 (10 - (-3))$
$\det(\Delta_1) = 6 (-4) + 4 (10 + 3)$
$\det(\Delta_1) = -24 + 4 (13)$
$\det(\Delta_1) = -24 + 52$
$\det(\Delta_1) = 28$
Conclusion:
The value of the original determinant is $\det(\Delta) = -28$.
The value of the determinant after interchanging R1 and R2 is $\det(\Delta_1) = 28$.
We observe that $\det(\Delta_1) = - \det(\Delta)$ ($28 = -(-28)$).
Thus, Property 2 is verified for the given determinant.
Example 8: Evaluate ∆ = $\begin{vmatrix} 3&2&3\\2&2&3\\3&2&3 \end{vmatrix}$
Answer:
Here is the evaluation of the given determinant.
Given:
The determinant $\Delta$ is given by:
$\Delta = \begin{vmatrix} 3& 2& 3 \\ 2& 2& 3 \\ 3& 2& 3 \end{vmatrix}$
To Evaluate:
The value of $\Delta$.
Solution:
We examine the rows and columns of the determinant for any identical or proportional lines.
Let's check the rows.
R1 = $\begin{pmatrix} 3 & 2 & 3 \end{pmatrix}$
R2 = $\begin{pmatrix} 2 & 2 & 3 \end{pmatrix}$
R3 = $\begin{pmatrix} 3 & 2 & 3 \end{pmatrix}$
We can see that the first row (R1) and the third row (R3) are identical.
R1 = R3 = $\begin{pmatrix} 3 & 2 & 3 \end{pmatrix}$
According to Property 3 of determinants:
If any two rows (or columns) of a determinant are identical, then the value of the determinant is zero.
Since R1 and R3 are identical in the given determinant $\Delta$, its value is 0.
$\Delta = 0$
(Because R1 = R3)
Alternatively, we can evaluate the determinant by expanding along any row or column. Let's expand along the first row (R1):
$\Delta = 3 \begin{vmatrix} 2 & 3 \\ 2 & 3 \end{vmatrix} - 2 \begin{vmatrix} 2 & 3 \\ 3 & 3 \end{vmatrix} + 3 \begin{vmatrix} 2 & 2 \\ 3 & 2 \end{vmatrix}$
$\Delta = 3 ((2)(3) - (3)(2)) - 2 ((2)(3) - (3)(3)) + 3 ((2)(2) - (2)(3))$
$\Delta = 3 (6 - 6) - 2 (6 - 9) + 3 (4 - 6)$
$\Delta = 3 (0) - 2 (-3) + 3 (-2)$
$\Delta = 0 + 6 - 6$
$\Delta = 0$
Both methods yield the same result. The value of the determinant is 0.
Example 9: Evaluate $\begin{vmatrix} 102&18&36\\1&3&4\\17&3&6 \end{vmatrix}$
Answer:
Here is the evaluation of the given determinant.
Given:
The determinant $\Delta$ is given by:
$\Delta = \begin{vmatrix} 102& 18& 36 \\ 1& 3& 4 \\ 17& 3& 6 \end{vmatrix}$
To Evaluate:
The value of $\Delta$.
Solution:
We examine the rows and columns of the determinant for any specific properties.
Consider the first row (R1) and the third row (R3):
R1 = $\begin{pmatrix} 102 & 18 & 36 \end{pmatrix}$
R3 = $\begin{pmatrix} 17 & 3 & 6 \end{pmatrix}$
Let's check if R1 is a multiple of R3 by comparing the ratios of corresponding elements:
Ratio of first elements: $\frac{102}{17} = 6$
Ratio of second elements: $\frac{18}{3} = 6$
Ratio of third elements: $\frac{36}{6} = 6$
Since the ratio of corresponding elements is constant (equal to 6), the first row (R1) is proportional to the third row (R3). Specifically, R1 = $6 \times$ R3.
According to Property 3 of determinants (extended version):
If any two rows (or columns) of a determinant are proportional, then the value of the determinant is zero.
Since R1 and R3 are proportional in the given determinant $\Delta$, its value is 0.
$\Delta = 0$
(Because R1 is proportional to R3)
Verification by Expansion:
We can also evaluate the determinant by expanding along any row or column to verify the result. Let's expand along the first row (R1):
$\Delta = 102 \begin{vmatrix} 3 & 4 \\ 3 & 6 \end{vmatrix} - 18 \begin{vmatrix} 1 & 4 \\ 17 & 6 \end{vmatrix} + 36 \begin{vmatrix} 1 & 3 \\ 17 & 3 \end{vmatrix}$
$\Delta = 102 ((3)(6) - (4)(3)) - 18 ((1)(6) - (4)(17)) + 36 ((1)(3) - (3)(17))$
$\Delta = 102 (18 - 12) - 18 (6 - 68) + 36 (3 - 51)$
$\Delta = 102 (6) - 18 (-62) + 36 (-48)$
$\Delta = 612 + 1116 - 1728$
$\Delta = 1728 - 1728$
$\Delta = 0$
Both the property method and the expansion method confirm that the value of the determinant is 0.
Example 10: Show that $\begin{vmatrix} a&b&c\\a+2x&b+2y&c+2z\\x&y&z \end{vmatrix} = 0$
Answer:
Here is the step-by-step process to show that the given determinant is equal to 0.
Given:
The determinant $\Delta = \begin{vmatrix} a& b& c \\ a+2x& b+2y& c+2z \\ x& y& z \end{vmatrix}$
To Show:
$\Delta = 0$
Solution:
We will use the properties of determinants to evaluate the given determinant.
The second row (R2) of the determinant is a sum of two terms. Using Property 5, which states that if some or all elements of a row or column of a determinant are expressed as sum of two (or more) terms, then the determinant can be expressed as sum of two (or more) determinants, we can split the determinant:
$\Delta = \begin{vmatrix} a& b& c \\ a& b& c \\ x& y& z \end{vmatrix} + \begin{vmatrix} a& b& c \\ 2x& 2y& 2z \\ x& y& z \end{vmatrix}$
Let the first determinant be $\Delta_1$ and the second determinant be $\Delta_2$.
$\Delta_1 = \begin{vmatrix} a& b& c \\ a& b& c \\ x& y& z \end{vmatrix}$
In $\Delta_1$, the first row (R1) and the second row (R2) are identical.
By Property 3, if any two rows (or columns) of a determinant are identical, then the value of the determinant is zero.
Thus, $\Delta_1 = 0$.
Now consider the second determinant:
$\Delta_2 = \begin{vmatrix} a& b& c \\ 2x& 2y& 2z \\ x& y& z \end{vmatrix}$
In $\Delta_2$, the second row (R2) has a common factor of 2 in all its elements. Using Property 4, which states that if each element of a row (or a column) of a determinant is multiplied by a constant k, then its value gets multiplied by k, we can take out the common factor 2 from R2:
$\Delta_2 = 2 \begin{vmatrix} a& b& c \\ x& y& z \\ x& y& z \end{vmatrix}$
Now, examine the determinant $ \begin{vmatrix} a& b& c \\ x& y& z \\ x& y& z \end{vmatrix} $. In this determinant, the second row (R2) and the third row (R3) are identical.
By Property 3, if any two rows (or columns) of a determinant are identical, then the value of the determinant is zero.
So, $ \begin{vmatrix} a& b& c \\ x& y& z \\ x& y& z \end{vmatrix} = 0 $.
Substituting this back into the expression for $\Delta_2$:
$\Delta_2 = 2 \times 0 = 0$
Finally, substituting the values of $\Delta_1$ and $\Delta_2$ back into the expression for $\Delta$:
$\Delta = \Delta_1 + \Delta_2$
$\Delta = 0 + 0$
$\Delta = 0$
Thus, it is shown that $\begin{vmatrix} a& b& c \\ a+2x& b+2y& c+2z \\ x& y& z \end{vmatrix} = 0$.
Example 11: Prove that $\begin{vmatrix} a&a+b&a+b+c\\2a&3a+2b&4a+3b+2c\\3a&6a+3b&10a+6b+3c \end{vmatrix} = a^3$.
Answer:
Here is the proof that the given determinant is equal to $a^3$.
Given:
The determinant $\Delta = \begin{vmatrix} a& a+b& a+b+c \\ 2a& 3a+2b& 4a+3b+2c \\ 3a& 6a+3b& 10a+6b+3c \end{vmatrix}$
To Prove:
$\Delta = a^3$
Proof:
We will use elementary row operations to simplify the determinant. Applying elementary row operations of the form $R_i \to R_i + kR_j$ does not change the value of the determinant (Property 6).
Apply the operation $R_2 \to R_2 - 2R_1$:
New R2 (1,1): $2a - 2(a) = 0$
New R2 (1,2): $(3a+2b) - 2(a+b) = 3a+2b - 2a - 2b = a$
New R2 (1,3): $(4a+3b+2c) - 2(a+b+c) = 4a+3b+2c - 2a - 2b - 2c = 2a+b$
The determinant becomes:
$\Delta = \begin{vmatrix} a& a+b& a+b+c \\ 0& a& 2a+b \\ 3a& 6a+3b& 10a+6b+3c \end{vmatrix}$
Now, apply the operation $R_3 \to R_3 - 3R_1$:
New R3 (1,1): $3a - 3(a) = 0$
New R3 (1,2): $(6a+3b) - 3(a+b) = 6a+3b - 3a - 3b = 3a$
New R3 (1,3): $(10a+6b+3c) - 3(a+b+c) = 10a+6b+3c - 3a - 3b - 3c = 7a+3b$
The determinant becomes:
$\Delta = \begin{vmatrix} a& a+b& a+b+c \\ 0& a& 2a+b \\ 0& 3a& 7a+3b \end{vmatrix}$
Now, we can expand the determinant along the first column (C1) as it contains two zeros:
$\Delta = a \times (\text{Minor of element a in R1, C1}) - 0 \times (\text{Minor of element 0 in R2, C1}) + 0 \times (\text{Minor of element 0 in R3, C1})$
$\Delta = a \begin{vmatrix} a& 2a+b \\ 3a& 7a+3b \end{vmatrix} - 0 + 0$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} a& 2a+b \\ 3a& 7a+3b \end{vmatrix} = (a)(7a+3b) - (2a+b)(3a)$
$ = 7a^2 + 3ab - (6a^2 + 3ab)$
$ = 7a^2 + 3ab - 6a^2 - 3ab$
$ = (7a^2 - 6a^2) + (3ab - 3ab)$
$ = a^2 + 0$
$ = a^2$
Substituting this back into the expression for $\Delta$:
$\Delta = a \times (a^2)$
$\Delta = a^3$
Thus, it is proved that $\begin{vmatrix} a& a+b& a+b+c \\ 2a& 3a+2b& 4a+3b+2c \\ 3a& 6a+3b& 10a+6b+3c \end{vmatrix} = a^3$.
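The identity can also be confirmed symbolically; the SymPy sketch below is an illustrative check, not a substitute for the proof.

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

delta = sp.Matrix([
    [a,       a + b,      a + b + c],
    [2*a, 3*a + 2*b,  4*a + 3*b + 2*c],
    [3*a, 6*a + 3*b, 10*a + 6*b + 3*c],
]).det()

print(sp.simplify(delta - a**3))   # 0, confirming that the determinant equals a^3
```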
Example 12: Without expanding, prove that
∆ = $\begin{vmatrix} x+y&y+z&z+x\\z&x&y\\1&1&1 \end{vmatrix} = 0$ .
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it.
Given:
The determinant $\Delta = \begin{vmatrix} x+y& y+z& z+x \\ z& x& y \\ 1& 1& 1 \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
We will use elementary row operations and properties of determinants.
Apply the row operation $R_1 \to R_1 + R_2$. This operation does not change the value of the determinant (Property 6).
The new first row will be:
R1 (new element 1): $(x+y) + z = x+y+z$
R1 (new element 2): $(y+z) + x = x+y+z$
R1 (new element 3): $(z+x) + y = x+y+z$
So the determinant becomes:
$\Delta = \begin{vmatrix} x+y+z& x+y+z& x+y+z \\ z& x& y \\ 1& 1& 1 \end{vmatrix}$
Now, we can observe that the first row (R1) has a common factor of $(x+y+z)$ in all its elements.
Using Property 4, which states that if each element of a row (or a column) of a determinant is multiplied by a constant k, then its value gets multiplied by k, we can take out the common factor $(x+y+z)$ from R1:
$\Delta = (x+y+z) \begin{vmatrix} 1& 1& 1 \\ z& x& y \\ 1& 1& 1 \end{vmatrix}$
Now, examine the determinant $ \begin{vmatrix} 1& 1& 1 \\ z& x& y \\ 1& 1& 1 \end{vmatrix} $.
In this determinant, the first row (R1) and the third row (R3) are identical.
R1 = $\begin{pmatrix} 1 & 1 & 1 \end{pmatrix}$
R3 = $\begin{pmatrix} 1 & 1 & 1 \end{pmatrix}$
According to Property 3, if any two rows (or columns) of a determinant are identical, then the value of the determinant is zero.
So, $ \begin{vmatrix} 1& 1& 1 \\ z& x& y \\ 1& 1& 1 \end{vmatrix} = 0 $.
Substituting this back into the expression for $\Delta$:
$\Delta = (x+y+z) \times 0$
$\Delta = 0$
Thus, it is proved that $\begin{vmatrix} x+y& y+z& z+x \\ z& x& y \\ 1& 1& 1 \end{vmatrix} = 0$ without expanding the determinant.
Example 13: Evaluate
∆ = $\begin{vmatrix} 1&a&bc\\1&b&ca\\1&c&ab \end{vmatrix}$
Answer:
Here is the evaluation of the given determinant using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} 1& a& bc \\ 1& b& ca \\ 1& c& ab \end{vmatrix}$
To Evaluate:
The value of $\Delta$.
Solution:
We will use elementary row operations to simplify the determinant. Applying elementary row operations of the form $R_i \to R_i - R_j$ does not change the value of the determinant (Property 6).
Apply the operation $R_2 \to R_2 - R_1$:
New R2 (1,1): $1 - 1 = 0$
New R2 (1,2): $b - a$
New R2 (1,3): $ca - bc = c(a - b)$
The determinant becomes:
$\Delta = \begin{vmatrix} 1& a& bc\\0& b-a& c(a-b)\\1& c& ab \end{vmatrix}$
Now, apply the operation $R_3 \to R_3 - R_1$:
New R3 (1,1): $1 - 1 = 0$
New R3 (1,2): $c - a$
New R3 (1,3): $ab - bc = b(a - c)$
The determinant becomes:
$\Delta = \begin{vmatrix} 1& a& bc\\0& b-a& c(a-b)\\0& c-a& b(a-c) \end{vmatrix}$
Now, expand the determinant along the first column (C1) as it contains two zeros:
$\Delta = 1 \times \begin{vmatrix} b-a& c(a-b)\\c-a& b(a-c) \end{vmatrix} - 0 \times (\text{minor}) + 0 \times (\text{minor})$
$\Delta = \begin{vmatrix} b-a& c(a-b)\\c-a& b(a-c) \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\Delta = (b-a) \times b(a-c) - c(a-b) \times (c-a)$
We can write $a-b = -(b-a)$ and $a-c = -(c-a)$. Substitute these into the expression:
$\Delta = (b-a) b (-(c-a)) - c (-(b-a)) (c-a)$
$\Delta = -(b-a) b (c-a) + c (b-a) (c-a)$
Now, factor out the common term $(b-a)(c-a)$:
$\Delta = (b-a)(c-a) [-b + c]$
$\Delta = (b-a)(c-a)(c-b)$
This result can also be written in the standard cyclic form $(a-b)(b-c)(c-a)$. Writing $(b-a) = -(a-b)$ and $(c-b) = -(b-c)$, the two sign changes cancel:
$\Delta = (b-a)(c-a)(c-b) = [-(a-b)]\,(c-a)\,[-(b-c)] = (a-b)(b-c)(c-a)$
The value of the determinant is $(a-b)(b-c)(c-a)$.
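An illustrative SymPy check of this factorisation (supplementary to the solution above):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

delta = sp.Matrix([[1, a, b*c], [1, b, c*a], [1, c, a*b]]).det()

print(sp.simplify(delta - (a - b)*(b - c)*(c - a)))   # 0, so delta = (a-b)(b-c)(c-a)
```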
Example 14: Prove that $\begin{vmatrix} b+c&a&a\\b&c+a&b\\c&c&a+b \end{vmatrix} = 4abc$ .
Answer:
Here is the proof that the given determinant is equal to $4abc$.
Given:
The determinant $\Delta = \begin{vmatrix} b+c& a& a\\b& c+a& b\\c& c& a+b \end{vmatrix}$
To Prove:
$\Delta = 4abc$
Proof:
We will use elementary row operations to simplify the determinant. Applying elementary row operations of the form $R_i \to R_i + kR_j$ does not change the value of the determinant (Property 6).
Apply the operation $R_1 \to R_1 - (R_2 + R_3)$:
New R1 (1,1): $(b+c) - (b+c) = 0$
New R1 (1,2): $a - (c+a + c) = a - c - a - c = -2c$
New R1 (1,3): $a - (b + a+b) = a - b - a - b = -2b$
The determinant becomes:
$\Delta = \begin{vmatrix} 0& -2c& -2b\\b& c+a& b\\c& c& a+b \end{vmatrix}$
Now, expand the determinant along the first row (R1) as it contains a zero in the first position. The terms with a multiplier of 0 will vanish.
$\Delta = 0 \cdot (\text{Minor of element 0 in R1, C1}) - (-2c) \cdot (\text{Minor of element -2c in R1, C2}) + (-2b) \cdot (\text{Minor of element -2b in R1, C3})$
$\Delta = 0 - (-2c) \begin{vmatrix} b& b \\ c& a+b \end{vmatrix} + (-2b) \begin{vmatrix} b& c+a \\ c& c \end{vmatrix}$
$\Delta = 2c \begin{vmatrix} b& b \\ c& a+b \end{vmatrix} - 2b \begin{vmatrix} b& c+a \\ c& c \end{vmatrix}$
Evaluate the $2 \times 2$ determinants:
$\begin{vmatrix} b& b \\ c& a+b \end{vmatrix} = (b)(a+b) - (b)(c) = ab + b^2 - bc$
$\begin{vmatrix} b& c+a \\ c& c \end{vmatrix} = (b)(c) - (c+a)(c) = bc - (c^2 + ac) = bc - c^2 - ac$
Substitute these values back into the expression for $\Delta$:
$\Delta = 2c (ab + b^2 - bc) - 2b (bc - c^2 - ac)$
$\Delta = 2abc + 2b^2c - 2bc^2 - 2b^2c + 2bc^2 + 2abc$
Combine like terms:
$\Delta = (2abc + 2abc) + (2b^2c - 2b^2c) + (-2bc^2 + 2bc^2)$
$\Delta = 4abc + 0 + 0$
$\Delta = 4abc$
Thus, it is proved that $\begin{vmatrix} b+c& a& a\\b& c+a& b\\c& c& a+b \end{vmatrix} = 4abc$.
Example 15: If x, y, z are different and ∆ = $\begin{vmatrix} x&x^2&1+x^3\\y&y^2&1+y^3\\z&z^2&1+z^3 \end{vmatrix} = 0$ , then show that 1 + xyz = 0
Answer:
Here is the proof to show that $1 + xyz = 0$ given the condition.
Given:
x, y, z are different (i.e., $x \neq y$, $y \neq z$, $z \neq x$).
The determinant $\Delta = \begin{vmatrix} x& x^2& 1+x^3\\y& y^2& 1+y^3\\z& z^2& 1+z^3 \end{vmatrix} = 0$
To Show:
$1 + xyz = 0$
Proof:
We are given the determinant $\Delta = \begin{vmatrix} x& x^2& 1+x^3\\y& y^2& 1+y^3\\z& z^2& 1+z^3 \end{vmatrix}$.
Using Property 5, we can write the determinant as the sum of two determinants because the elements of the third column (C3) are sums of two terms:
$\Delta = \begin{vmatrix} x& x^2& 1\\y& y^2& 1\\z& z^2& 1 \end{vmatrix} + \begin{vmatrix} x& x^2& x^3\\y& y^2& y^3\\z& z^2& z^3 \end{vmatrix}$
Let $\Delta_1 = \begin{vmatrix} x& x^2& 1\\y& y^2& 1\\z& z^2& 1 \end{vmatrix}$ and $\Delta_2 = \begin{vmatrix} x& x^2& x^3\\y& y^2& y^3\\z& z^2& z^3 \end{vmatrix}$.
So, $\Delta = \Delta_1 + \Delta_2$.
Consider $\Delta_1 = \begin{vmatrix} x& x^2& 1\\y& y^2& 1\\z& z^2& 1 \end{vmatrix}$.
Interchange $C_2 \leftrightarrow C_3$; this changes the sign of the determinant (Property 2):
$\Delta_1 = - \begin{vmatrix} x& 1& x^2\\y& 1& y^2\\z& 1& z^2 \end{vmatrix}$
Interchange $C_1 \leftrightarrow C_2$; this changes the sign again (Property 2):
$\Delta_1 = \begin{vmatrix} 1& x& x^2\\1& y& y^2\\1& z& z^2 \end{vmatrix}$
Now consider $\Delta_2 = \begin{vmatrix} x& x^2& x^3\\y& y^2& y^3\\z& z^2& z^3 \end{vmatrix}$.
Take out the common factor $x$ from R1, $y$ from R2, and $z$ from R3 (Property 4):
$\Delta_2 = xyz \begin{vmatrix} 1& x& x^2\\1& y& y^2\\1& z& z^2 \end{vmatrix}$
Therefore,
$\Delta = \Delta_1 + \Delta_2 = (1 + xyz) \begin{vmatrix} 1& x& x^2\\1& y& y^2\\1& z& z^2 \end{vmatrix}$
To evaluate the remaining determinant, apply $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$ (Property 6) and expand along the first column:
$\begin{vmatrix} 1& x& x^2\\1& y& y^2\\1& z& z^2 \end{vmatrix} = \begin{vmatrix} 1& x& x^2\\0& y-x& y^2-x^2\\0& z-x& z^2-x^2 \end{vmatrix} = (y-x)(z^2-x^2) - (z-x)(y^2-x^2)$
$= (y-x)(z-x)(z+x) - (z-x)(y-x)(y+x) = (y-x)(z-x)(z-y)$
Hence,
$\Delta = (1 + xyz)(y-x)(z-x)(z-y)$
Since $\Delta = 0$, we have
$(y-x)(z-x)(z-y)(1 + xyz) = 0$
We are given that x, y, z are different.
This means $x \neq y$, $y \neq z$, and $z \neq x$.
Therefore, $(y-x) \neq 0$, $(z-x) \neq 0$, and $(z-y) \neq 0$.
The product $(y-x)(z-x)(z-y) \neq 0$.
Since the product of $(y-x)(z-x)(z-y)$ and $(1 + xyz)$ is 0, and $(y-x)(z-x)(z-y)$ is not 0, it must be that the other factor is 0.
So, $1 + xyz = 0$.
This concludes the proof.
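As a supplementary illustration, the factorisation underlying this argument can be confirmed with SymPy; the check below is not part of the NCERT solution.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

delta = sp.Matrix([
    [x, x**2, 1 + x**3],
    [y, y**2, 1 + y**3],
    [z, z**2, 1 + z**3],
]).det()

# The factored form contains the factor (x*y*z + 1) together with the pairwise differences of x, y, z
print(sp.factor(delta))
# The closed form derived above agrees with the determinant:
print(sp.simplify(delta - (1 + x*y*z)*(y - x)*(z - x)*(z - y)))   # 0
```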
Example 16: Show that
$\begin{vmatrix} 1+a&1&1\\1&1+b&1\\1&1&1+c \end{vmatrix}$ = $abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right)$ = $abc + bc + ca + ab$
Answer:
Here is the proof to show the equality for the given determinant.
Given:
The determinant $\Delta = \begin{vmatrix} 1+a& 1& 1\\1& 1+b& 1\\1& 1& 1+c \end{vmatrix}$
To Show:
$\Delta = abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right) = abc + bc + ca + ab$
Proof:
We will evaluate the determinant using elementary row operations. Applying elementary row operations of the form $R_i \to R_i - R_j$ does not change the value of the determinant (Property 6).
Consider the determinant:
$\Delta = \begin{vmatrix} 1+a& 1& 1\\1& 1+b& 1\\1& 1& 1+c \end{vmatrix}$
Apply the operation $R_1 \to R_1 - R_2$:
R1 elements change as:
(1+a) - 1 = a
1 - (1+b) = -b
1 - 1 = 0
The determinant becomes:
$\Delta = \begin{vmatrix} a& -b& 0\\1& 1+b& 1\\1& 1& 1+c \end{vmatrix}$
Now, apply the operation $R_2 \to R_2 - R_3$:
R2 elements change as:
1 - 1 = 0
(1+b) - 1 = b
1 - (1+c) = -c
The determinant becomes:
$\Delta = \begin{vmatrix} a& -b& 0\\0& b& -c\\1& 1& 1+c \end{vmatrix}$
Now, expand the determinant along the first column (C1) as it contains two zeros:
$\Delta = a \times (\text{Minor of element a in R1, C1}) - 0 \times (\text{Minor of element 0 in R2, C1}) + 1 \times (\text{Minor of element 1 in R3, C1})$
$\Delta = a \begin{vmatrix} b& -c\\1& 1+c \end{vmatrix} - 0 + 1 \begin{vmatrix} -b& 0\\b& -c \end{vmatrix}$
Evaluate the $2 \times 2$ determinants:
$\begin{vmatrix} b& -c\\1& 1+c \end{vmatrix} = (b)(1+c) - (-c)(1) = b + bc + c$
$\begin{vmatrix} -b& 0\\b& -c \end{vmatrix} = (-b)(-c) - (0)(b) = bc - 0 = bc$
Substitute these values back into the expression for $\Delta$:
$\Delta = a (b + bc + c) + 1 (bc)$
$\Delta = ab + abc + ac + bc$
Thus, we have shown that:
$\Delta = ab + bc + ca + abc$
... (i)
Now, let's show that the first expression is equal to the second expression. Consider the expression $abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right)$. Assume $a, b, c \neq 0$.
$abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right) = abc \times 1 + abc \times \frac{1}{a} + abc \times \frac{1}{b} + abc \times \frac{1}{c}$
$ = abc + \frac{abc}{a} + \frac{abc}{b} + \frac{abc}{c}$
$ = abc + bc + ac + ab$
$ = abc + bc + ca + ab$
This matches the result obtained from evaluating the determinant in equation (i).
$abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right) = abc + bc + ca + ab$
... (ii)
From equations (i) and (ii), we conclude that:
$\begin{vmatrix} 1+a& 1& 1\\1& 1+b& 1\\1& 1& 1+c \end{vmatrix}$ = $abc + bc + ca + ab$ = $abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right)$
Note that the expanded form $abc + bc + ca + ab$ is valid for all values of $a, b, c$, including when some are zero, whereas the factored form $abc \left(1+\frac{1}{a} +\frac{1}{b}+\frac{1}{c} \right)$ requires $a, b, c \neq 0$; clearing the denominators recovers the same expression $abc + bc + ca + ab$.
Exercise 4.2
Using the property of determinants and without expanding in Exercises 1 to 7, prove that:
Question 1. $\begin{vmatrix} x&a&x+a\\y&b&y+b\\z&c&z+c \end{vmatrix} = 0$
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it, using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} x& a& x+a\\y& b& y+b\\z& c& z+c \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
We will use elementary column operations and properties of determinants.
Consider the determinant:
$\Delta = \begin{vmatrix} x& a& x+a\\y& b& y+b\\z& c& z+c \end{vmatrix}$
Observe that the third column (C3) is the sum of the first column (C1) and the second column (C2).
Apply the column operation $C_3 \to C_3 - C_1$. This operation does not change the value of the determinant (Property 6).
New C3 elements are calculated as:
(x + a) - x = a
($y + b$) - y = b
($z + c$) - z = c
After applying the operation, the determinant becomes:
$\Delta = \begin{vmatrix} x& a& a\\y& b& b\\z& c& c \end{vmatrix}$
Now, examine the columns of this resulting determinant:
C1 = $\begin{pmatrix} x \\ y \\ z \end{pmatrix}$
C2 = $\begin{pmatrix} a \\ b \\ c \end{pmatrix}$
C3 = $\begin{pmatrix} a \\ b \\ c \end{pmatrix}$
We can see that the second column (C2) and the third column (C3) are identical.
C2 = C3
According to Property 3 of determinants, if any two columns (or rows) of a determinant are identical, then the value of the determinant is zero.
Since the columns C2 and C3 are identical in the determinant, its value is 0.
$\Delta = 0$
(Because C2 = C3)
Thus, it is proved that $\begin{vmatrix} x& a& x+a\\y& b& y+b\\z& c& z+c \end{vmatrix} = 0$ without expanding the determinant.
Question 2. $\begin{vmatrix} a−b&b−c&c−a\\b−c&c−a&a−b\\c−a&a−b&b−c \end{vmatrix} = 0$
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it, using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} a−b& b−c& c−a\\b−c& c−a& a−b\\c−a& a−b& b−c \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
We will use elementary row operations and properties of determinants. Applying an elementary row operation of the form $R_i \to R_i + kR_j$ does not change the value of the determinant (Property 6).
Consider the determinant:
$\Delta = \begin{vmatrix} a-b& b-c& c-a\\b-c& c-a& a-b\\c-a& a-b& b-c \end{vmatrix}$
Apply the operation $R_1 \to R_1 + R_2 + R_3$. This operation adds the elements of R2 and R3 to the corresponding elements of R1. The value of the determinant remains unchanged.
Let's compute the new elements of R1:
New R1 (1,1): $(a-b) + (b-c) + (c-a) = a-b+b-c+c-a = 0$
New R1 (1,2): $(b-c) + (c-a) + (a-b) = b-c+c-a+a-b = 0$
New R1 (1,3): $(c-a) + (a-b) + (b-c) = c-a+a-b+b-c = 0$
After applying the operation $R_1 \to R_1 + R_2 + R_3$, the determinant becomes:
$\Delta = \begin{vmatrix} 0& 0& 0\\b-c& c-a& a-b\\c-a& a-b& b-c \end{vmatrix}$
Now, examine the first row (R1) of this resulting determinant:
R1 = $\begin{pmatrix} 0 & 0 & 0 \end{pmatrix}$
According to a property of determinants, if all the elements of any one row (or any one column) are zero, then the value of the determinant is zero.
Since all elements in the first row (R1) are 0, the value of the determinant is 0.
$\Delta = 0$
(Because R1 has all elements as 0)
Thus, it is proved that $\begin{vmatrix} a−b& b−c& c−a\\b−c& c−a& a−b\\c−a& a−b& b−c \end{vmatrix} = 0$ without expanding the determinant.
Question 3. $\begin{vmatrix} 2&7&65\\3&8&75\\5&9&86 \end{vmatrix} = 0$
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it, using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} 2& 7& 65\\3& 8& 75\\5& 9& 86 \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
We will use elementary column operations and properties of determinants. Applying an elementary column operation of the form $C_i \to C_i + kC_j$ does not change the value of the determinant (Property 6).
Consider the determinant:
$\Delta = \begin{vmatrix} 2& 7& 65\\3& 8& 75\\5& 9& 86 \end{vmatrix}$
Let's examine the relationship between the columns. Let the columns be C1, C2, and C3.
C1 = $\begin{pmatrix} 2 \\ 3 \\ 5 \end{pmatrix}$, C2 = $\begin{pmatrix} 7 \\ 8 \\ 9 \end{pmatrix}$, C3 = $\begin{pmatrix} 65 \\ 75 \\ 86 \end{pmatrix}$
Let's try to express C3 as a linear combination of C1 and C2, say $k_1 \times$ C1 + $k_2 \times$ C2 = C3.
For the first row: $k_1 \times 2 + k_2 \times 7 = 65$
For the second row: $k_1 \times 3 + k_2 \times 8 = 75$
For the third row: $k_1 \times 5 + k_2 \times 9 = 86$
By observation or solving a system of equations from the first two rows:
Let's try $k_2 = 9$.
From row 1: $2k_1 + 9 \times 7 = 65 \implies 2k_1 + 63 = 65 \implies 2k_1 = 2 \implies k_1 = 1$.
Check if $k_1=1$ and $k_2=9$ satisfy the other rows:
Row 2: $1 \times 3 + 9 \times 8 = 3 + 72 = 75$. This matches the element in C3.
Row 3: $1 \times 5 + 9 \times 9 = 5 + 81 = 86$. This matches the element in C3.
So, the relationship is C3 = 1 $\times$ C1 + 9 $\times$ C2, or C3 = C1 + 9C2.
Now, apply the column operation $C_3 \to C_3 - C_1 - 9C_2$. This operation is of the form $C_i \to C_i + k_1 C_j + k_2 C_l$, which is equivalent to successive applications of $C_i \to C_i + kC_j$ and does not change the value of the determinant (Property 6 extended).
Let's compute the new elements of C3:
New C3 (1,3): $65 - 2 - 9 \times 7 = 65 - 2 - 63 = 65 - 65 = 0$
New C3 (2,3): $75 - 3 - 9 \times 8 = 75 - 3 - 72 = 75 - 75 = 0$
New C3 (3,3): $86 - 5 - 9 \times 9 = 86 - 5 - 81 = 86 - 86 = 0$
After applying the operation $C_3 \to C_3 - C_1 - 9C_2$, the determinant becomes:
$\Delta = \begin{vmatrix} 2& 7& 0\\3& 8& 0\\5& 9& 0 \end{vmatrix}$
Now, examine the third column (C3) of this resulting determinant:
C3 = $\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$
According to a property of determinants, if all the elements of any one column (or any one row) are zero, then the value of the determinant is zero.
Since all elements in the third column (C3) are 0, the value of the determinant is 0.
$\Delta = 0$
(Because C3 has all elements as 0)
Thus, it is proved that $\begin{vmatrix} 2& 7& 65\\3& 8& 75\\5& 9& 86 \end{vmatrix} = 0$ without expanding the determinant.
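The column relationship $C_3 = C_1 + 9C_2$ used above can be verified directly; the SymPy sketch below is a supplementary check.

```python
import sympy as sp

A = sp.Matrix([[2, 7, 65], [3, 8, 75], [5, 9, 86]])

print(A.col(2) == A.col(0) + 9 * A.col(1))   # True: C3 = C1 + 9*C2
print(A.det())                                # 0
```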
Question 4. $\begin{vmatrix} 1&bc&a(b+c)\\1&ca&b(c+a)\\1&ab&c(a+b) \end{vmatrix} = 0$
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it, using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} 1& bc& a(b+c)\\1& ca& b(c+a)\\1& ab& c(a+b) \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
We will use elementary column operations and properties of determinants. Applying an elementary column operation of the form $C_i \to C_i + kC_j$ does not change the value of the determinant (Property 6).
Consider the determinant:
$\Delta = \begin{vmatrix} 1& bc& a(b+c)\\1& ca& b(c+a)\\1& ab& c(a+b) \end{vmatrix}$
First, expand the terms in the third column (C3):
$\Delta = \begin{vmatrix} 1& bc& ab+ac\\1& ca& bc+ab\\1& ab& ca+bc \end{vmatrix}$
Apply the column operation $C_3 \to C_3 + C_2$. This operation does not change the value of the determinant.
Let's compute the new elements of C3:
New C3 (1,3): $(ab+ac) + bc = ab+bc+ca$
New C3 (2,3): $(bc+ab) + ca = ab+bc+ca$
New C3 (3,3): $(ca+bc) + ab = ab+bc+ca$
After applying the operation $C_3 \to C_3 + C_2$, the determinant becomes:
$\Delta = \begin{vmatrix} 1& bc& ab+bc+ca\\1& ca& ab+bc+ca\\1& ab& ab+bc+ca \end{vmatrix}$
Now, observe that all elements in the third column (C3) have a common factor of $(ab+bc+ca)$.
Using Property 4, which states that if each element of a column (or a row) of a determinant is multiplied by a constant k, then its value gets multiplied by k, we can take out the common factor $(ab+bc+ca)$ from C3:
$\Delta = (ab+bc+ca) \begin{vmatrix} 1& bc& 1\\1& ca& 1\\1& ab& 1 \end{vmatrix}$
Now, examine the resulting determinant $ \begin{vmatrix} 1& bc& 1\\1& ca& 1\\1& ab& 1 \end{vmatrix} $.
The first column (C1) and the third column (C3) of this determinant are identical.
C1 = C3 = $\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$
According to Property 3, if any two columns (or rows) of a determinant are identical, then the value of the determinant is zero.
Since columns C1 and C3 are identical in the determinant $ \begin{vmatrix} 1& bc& 1\\1& ca& 1\\1& ab& 1 \end{vmatrix} $, its value is 0.
Substitute this value back into the expression for $\Delta$:
$\Delta = (ab+bc+ca) \times 0$
$\Delta = 0$
Thus, it is proved that $\begin{vmatrix} 1& bc& a(b+c)\\1& ca& b(c+a)\\1& ab& c(a+b) \end{vmatrix} = 0$ without expanding the determinant.
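A brief SymPy sketch, offered only as a supplementary check of Question 4, confirms that the determinant vanishes symbolically:

```python
from sympy import symbols, Matrix, simplify

a, b, c = symbols('a b c')
D = Matrix([[1, b*c, a*(b + c)],
            [1, c*a, b*(c + a)],
            [1, a*b, c*(a + b)]]).det()

print(simplify(D))  # 0
```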
Question 5. $\begin{vmatrix} b+c&q+r&y+z\\c+a&r+p&z+x\\a+b&p+q&x+y \end{vmatrix} = 2 \begin{vmatrix} a&p&x\\b&q&y\\c&r&z \end{vmatrix}$
Answer:
Here is the proof of the given identity using properties of determinants and without expanding the original determinant.
Given:
The left-hand side determinant $\Delta_{LHS} = \begin{vmatrix} b+c& q+r& y+z\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$
The right-hand side is $2 \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$
To Prove:
$\Delta_{LHS} = 2 \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$ (without expanding $\Delta_{LHS}$)
Proof:
We will apply elementary row operations to the left-hand side determinant $\Delta_{LHS}$. Applying elementary row operations of the form $R_i \to R_i + kR_j$ does not change the value of the determinant (Property 6).
Consider $\Delta_{LHS} = \begin{vmatrix} b+c& q+r& y+z\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$.
Apply the operation $R_1 \to R_1 + R_2 + R_3$:
New R1 elements are:
(b+c) + (c+a) + (a+b) = 2a + 2b + 2c = 2(a+b+c)
(q+r) + (r+p) + (p+q) = 2p + 2q + 2r = 2(p+q+r)
(y+z) + (z+x) + (x+y) = 2x + 2y + 2z = 2(x+y+z)
The determinant becomes:
$\Delta_{LHS} = \begin{vmatrix} 2(a+b+c)& 2(p+q+r)& 2(x+y+z)\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$
Take out the common factor 2 from the first row (Property 4):
$\Delta_{LHS} = 2 \begin{vmatrix} a+b+c& p+q+r& x+y+z\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$
Let the determinant inside the scalar 2 be $\Delta'$. So $\Delta_{LHS} = 2 \Delta'$.
$\Delta' = \begin{vmatrix} a+b+c& p+q+r& x+y+z\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$
Now, we apply further row operations on $\Delta'$ to simplify it. These operations do not change the value of $\Delta'$.
Apply $R_1 \to R_1 - R_3$:
New R1 elements are:
(a+b+c) - (a+b) = c
(p+q+r) - (p+q) = r
(x+y+z) - (x+y) = z
$\Delta' = \begin{vmatrix} c& r& z\\c+a& r+p& z+x\\a+b& p+q& x+y \end{vmatrix}$
Apply $R_2 \to R_2 - R_1$ (using the current R1):
New R2 elements are:
(c+a) - c = a
(r+p) - r = p
(z+x) - z = x
$\Delta' = \begin{vmatrix} c& r& z\\a& p& x\\a+b& p+q& x+y \end{vmatrix}$
Apply $R_3 \to R_3 - R_2$ (using the current R2):
New R3 elements are:
(a+b) - a = b
(p+q) - p = q
(x+y) - x = y
The determinant $\Delta'$ becomes:
$\Delta' = \begin{vmatrix} c& r& z\\a& p& x\\b& q& y \end{vmatrix}$
Now, we need to rearrange the rows of this determinant to match the right-hand side determinant $\begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$. Swapping two rows of a determinant changes its sign (Property 2).
Interchanging $R_1$ and $R_2$ (one sign change):
$\begin{vmatrix} c& r& z\\a& p& x\\b& q& y \end{vmatrix} = - \begin{vmatrix} a& p& x\\c& r& z\\b& q& y \end{vmatrix}$
Now interchanging $R_2$ and $R_3$ in the determinant on the right (a second sign change):
$- \begin{vmatrix} a& p& x\\c& r& z\\b& q& y \end{vmatrix} = - \left( - \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix} \right) = \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$
The two sign changes cancel, so
$\Delta' = \begin{vmatrix} c& r& z\\a& p& x\\b& q& y \end{vmatrix} = \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$
Substituting this back into the expression for $\Delta_{LHS}$:
$\Delta_{LHS} = 2 \Delta'$
$\Delta_{LHS} = 2 \begin{vmatrix} a& p& x\\b& q& y\\c& r& z \end{vmatrix}$
Thus, the identity is proved.
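For readers who want a machine check of the Question 5 identity, a minimal SymPy sketch (symbol names follow the text) is:

```python
from sympy import symbols, Matrix, simplify

a, b, c, p, q, r, x, y, z = symbols('a b c p q r x y z')

lhs = Matrix([[b + c, q + r, y + z],
              [c + a, r + p, z + x],
              [a + b, p + q, x + y]]).det()
rhs = 2 * Matrix([[a, p, x],
                  [b, q, y],
                  [c, r, z]]).det()

print(simplify(lhs - rhs))  # 0, so LHS = RHS
```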
Question 6. $\begin{vmatrix} 0&a&−b\\−a&0&−c\\b&c&0 \end{vmatrix} = 0$ .
Answer:
Here is the proof that the given determinant is equal to 0 without expanding it, using properties of determinants.
Given:
The determinant $\Delta = \begin{vmatrix} 0& a& −b\\−a& 0& −c\\b& c& 0 \end{vmatrix}$
To Prove:
$\Delta = 0$ (without expanding)
Proof:
Let the given matrix be A:
$A = \begin{pmatrix} 0& a& -b\\-a& 0& -c\\b& c& 0 \end{pmatrix}$
The determinant is $\Delta = \det(A)$.
Consider the transpose of the matrix A, denoted by A'. The transpose is obtained by interchanging rows and columns.
$A' = \begin{pmatrix} 0& -a& b\\a& 0& c\\-b& -c& 0 \end{pmatrix}$
According to Property 1 of determinants, the value of a determinant remains unchanged if its rows and columns are interchanged (i.e., $\det(A') = \det(A)$).
$\det(A') = \Delta$
... (i)
Now, observe the relationship between the matrix A' and the original matrix A. If we take out a factor of -1 from each element of A', we get:
$A' = \begin{pmatrix} -(0)& -(a)& -(-b)\\-(-a)& -(0)& -(-c)\\-(b)& -(c)& -(0) \end{pmatrix} = (-1) \begin{pmatrix} 0& a& -b\\-a& 0& -c\\b& c& 0 \end{pmatrix} = (-1) A$
So, $A' = -A$.
According to Property 4 (extended for a matrix), if each element of an $n \times n$ matrix A is multiplied by a constant k, then the determinant of the resulting matrix is $k^n$ times the determinant of A. In this case, $k = -1$ and the order of the matrix is $n = 3$.
So, $\det(A') = \det(-A) = (-1)^3 \det(A)$
$\det(A') = -1 \times \Delta = -\Delta$
$\det(A') = -\Delta$
... (ii)
Equating the expressions for $\det(A')$ from equations (i) and (ii):
$\Delta = -\Delta$
Adding $\Delta$ to both sides:
$\Delta + \Delta = 0$
$2\Delta = 0$
Dividing by 2:
$\Delta = 0$
Thus, it is proved that $\begin{vmatrix} 0& a& −b\\−a& 0& −c\\b& c& 0 \end{vmatrix} = 0$ without expanding the determinant.
This result is a specific case of a general property that the determinant of a skew-symmetric matrix of odd order is always zero.
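A short SymPy sketch illustrating this closing remark for the $3 \times 3$ case, assuming SymPy is available:

```python
from sympy import symbols, Matrix, simplify

a, b, c = symbols('a b c')
A = Matrix([[ 0,  a, -b],
            [-a,  0, -c],
            [ b,  c,  0]])

print(A.T == -A)          # True: A is skew-symmetric
print(simplify(A.det()))  # 0
```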
Question 7. $\begin{vmatrix} −a^2&ab&ac\\ba&−b^2&bc\\ca&cb&−c^2 \end{vmatrix} = 4a^2b^2c^2$ .
Answer:
Here is the proof that the given determinant is equal to $4a^2b^2c^2$ using properties of determinants and without expanding the original determinant directly.
Given:
The determinant $\Delta = \begin{vmatrix} −a^2& ab& ac\\ba& −b^2& bc\\ca& cb& −c^2 \end{vmatrix}$
To Prove:
$\Delta = 4a^2b^2c^2$ (using properties)
Proof:
Consider the determinant:
$\Delta = \begin{vmatrix} −a^2& ab& ac\\ba& −b^2& bc\\ca& cb& −c^2 \end{vmatrix}$
We can observe that the first row (R1) has 'a' as a common factor, the second row (R2) has 'b' as a common factor, and the third row (R3) has 'c' as a common factor.
Using Property 4, which states that if each element of a row (or a column) of a determinant is multiplied by a constant k, then its value gets multiplied by k, we can take out the common factors from each row.
Taking 'a' out from R1, 'b' out from R2, and 'c' out from R3:
$\Delta = a \cdot b \cdot c \begin{vmatrix} −a& b& c\\a& −b& c\\a& b& −c \end{vmatrix}$
Let the resulting determinant be $D' = \begin{vmatrix} −a& b& c\\a& −b& c\\a& b& −c \end{vmatrix}$. So, $\Delta = abc \cdot D'$.
Now, observe that the first column (C1) of $D'$ has 'a' as a common factor, the second column (C2) has 'b' as a common factor, and the third column (C3) has 'c' as a common factor.
Using Property 4 again, we can take out the common factors from each column of $D'$.
Taking 'a' out from C1, 'b' out from C2, and 'c' out from C3:
$D' = a \cdot b \cdot c \begin{vmatrix} −1& 1& 1\\1& −1& 1\\1& 1& −1 \end{vmatrix}$
Substitute this back into the expression for $\Delta$:
$\Delta = abc \cdot (abc \begin{vmatrix} −1& 1& 1\\1& −1& 1\\1& 1& −1 \end{vmatrix})$
$\Delta = a^2b^2c^2 \begin{vmatrix} −1& 1& 1\\1& −1& 1\\1& 1& −1 \end{vmatrix}$
Let the remaining numerical determinant be $D'' = \begin{vmatrix} −1& 1& 1\\1& −1& 1\\1& 1& −1 \end{vmatrix}$.
We can evaluate $D''$ using elementary row/column operations. Apply the operation $R_1 \to R_1 + R_2$ (Property 6):
$D'' = \begin{vmatrix} -1+1& 1+(-1)& 1+1\\1& -1& 1\\1& 1& -1 \end{vmatrix} = \begin{vmatrix} 0& 0& 2\\1& -1& 1\\1& 1& −1 \end{vmatrix}$
Now, expand this determinant along the first row (R1) as it has two zeros:
$D'' = 0 \cdot (\text{Minor of element 0 in R1, C1}) - 0 \cdot (\text{Minor of element 0 in R1, C2}) + 2 \cdot (\text{Minor of element 2 in R1, C3})$
$D'' = 0 - 0 + 2 \begin{vmatrix} 1& -1 \\ 1& 1 \end{vmatrix}$
$D'' = 2 ((1)(1) - (-1)(1))$
$D'' = 2 (1 - (-1))$
$D'' = 2 (1 + 1)$
$D'' = 2 (2)$
$D'' = 4$
... (i)
Substitute the value of $D''$ from equation (i) back into the expression for $\Delta$:
$\Delta = a^2b^2c^2 \cdot D''$
$\Delta = a^2b^2c^2 \cdot 4$
$\Delta = 4a^2b^2c^2$
Thus, it is proved that $\begin{vmatrix} −a^2& ab& ac\\ba& −b^2& bc\\ca& cb& −c^2 \end{vmatrix} = 4a^2b^2c^2$.
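The Question 7 result can also be confirmed symbolically with a small SymPy check (supplementary, not part of the proof):

```python
from sympy import symbols, Matrix, simplify

a, b, c = symbols('a b c')
D = Matrix([[-a**2,  a*b,   a*c],
            [ b*a, -b**2,   b*c],
            [ c*a,   c*b, -c**2]]).det()

print(simplify(D - 4*a**2*b**2*c**2))  # 0
```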
By using properties of determinants, in Exercises 8 to 14, show that:
Question 8.
(i) $\begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix} = (a - b) (b - c) (c - a)$
(ii) $\begin{vmatrix} 1&1&1\\a&b&c\\a^3&b^3&c^3 \end{vmatrix} = (a - b) (b - c) (c - a) (a + b + c)$
Answer:
(i)
Given: The determinant $\begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix}$.
To Show: $\begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix} = (a - b) (b - c) (c - a)$.
Solution:
Let $\Delta = \begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix}$.
Apply row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$.
We get:
$\Delta = \begin{vmatrix} 1 & a & a^2 \\ 1-1 & b-a & b^2-a^2 \\ 1-1 & c-a & c^2-a^2 \end{vmatrix}$
$\Delta = \begin{vmatrix} 1 & a & a^2 \\ 0 & b-a & (b-a)(b+a) \\ 0 & c-a & (c-a)(c+a) \end{vmatrix}$
Taking out common factors $(b-a)$ from $R_2$ and $(c-a)$ from $R_3$:
$\Delta = (b-a)(c-a) \begin{vmatrix} 1 & a & a^2 \\ 0 & 1 & b+a \\ 0 & 1 & c+a \end{vmatrix}$
Expand the determinant along the first column ($C_1$). The cofactors of the second and third elements in the first column are zero.
$\Delta = (b-a)(c-a) \left[ 1 \begin{vmatrix} 1 & b+a \\ 1 & c+a \end{vmatrix} - 0 + 0 \right]$
$\Delta = (b-a)(c-a) [1(c+a) - 1(b+a)]$
$\Delta = (b-a)(c-a) [c+a - b-a]$
$\Delta = (b-a)(c-a) [c-b]$
Rearranging the terms to match the required form $(a-b)(b-c)(c-a)$:
We know that $(b-a) = -(a-b)$ and $(c-b) = -(b-c)$.
$\Delta = -(a-b) \cdot (c-a) \cdot -(b-c)$
$\Delta = (-1)(-1) (a-b)(b-c)(c-a)$
$\Delta = (a-b)(b-c)(c-a)$
This is the Right Hand Side (RHS).
Hence, $\begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix} = (a - b) (b - c) (c - a)$ is proved.
(ii)
Given: The determinant $\begin{vmatrix} 1&1&1\\a&b&c\\a^3&b^3&c^3 \end{vmatrix}$.
To Show: $\begin{vmatrix} 1&1&1\\a&b&c\\a^3&b^3&c^3 \end{vmatrix} = (a - b) (b - c) (c - a) (a + b + c)$.
Solution:
Let $\Delta = \begin{vmatrix} 1&1&1\\a&b&c\\a^3&b^3&c^3 \end{vmatrix}$.
Apply column operations $C_2 \to C_2 - C_1$ and $C_3 \to C_3 - C_1$.
We get:
$\Delta = \begin{vmatrix} 1 & 1-1 & 1-1 \\ a & b-a & c-a \\ a^3 & b^3-a^3 & c^3-a^3 \end{vmatrix}$
$\Delta = \begin{vmatrix} 1 & 0 & 0 \\ a & b-a & c-a \\ a^3 & (b-a)(b^2+ab+a^2) & (c-a)(c^2+ac+a^2) \end{vmatrix}$
Taking out common factors $(b-a)$ from $C_2$ and $(c-a)$ from $C_3$:
$\Delta = (b-a)(c-a) \begin{vmatrix} 1 & 0 & 0 \\ a & 1 & 1 \\ a^3 & b^2+ab+a^2 & c^2+ac+a^2 \end{vmatrix}$
Expand the determinant along the first row ($R_1$). The cofactors of the second and third elements in the first row are zero.
$\Delta = (b-a)(c-a) \left[ 1 \begin{vmatrix} 1 & 1 \\ b^2+ab+a^2 & c^2+ac+a^2 \end{vmatrix} - 0 + 0 \right]$
$\Delta = (b-a)(c-a) [1(c^2+ac+a^2) - 1(b^2+ab+a^2)]$
$\Delta = (b-a)(c-a) [c^2+ac+a^2 - b^2-ab-a^2]$
$\Delta = (b-a)(c-a) [c^2-b^2 + ac-ab]$
Factor the terms inside the bracket:
$\Delta = (b-a)(c-a) [(c-b)(c+b) + a(c-b)]$
Take out the common factor $(c-b)$ from the terms in the bracket:
$\Delta = (b-a)(c-a) (c-b) [(c+b) + a]$
$\Delta = (b-a)(c-a) (c-b) (a+b+c)$
Rearranging the terms to match the required form $(a-b)(b-c)(c-a)(a+b+c)$:
We know that $(b-a) = -(a-b)$ and $(c-b) = -(b-c)$.
$\Delta = -(a-b) \cdot (c-a) \cdot -(b-c) \cdot (a+b+c)$
$\Delta = (-1)(-1) (a-b)(b-c)(c-a)(a+b+c)$
$\Delta = (a-b)(b-c)(c-a)(a+b+c)$
This is the Right Hand Side (RHS).
Hence, $\begin{vmatrix} 1&1&1\\a&b&c\\a^3&b^3&c^3 \end{vmatrix} = (a - b) (b - c) (c - a) (a + b + c)$ is proved.
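Both parts of Question 8 can be double-checked symbolically; the following SymPy sketch is illustrative only:

```python
from sympy import symbols, Matrix, expand

a, b, c = symbols('a b c')

D1 = Matrix([[1, a, a**2],
             [1, b, b**2],
             [1, c, c**2]]).det()
D2 = Matrix([[1, 1, 1],
             [a, b, c],
             [a**3, b**3, c**3]]).det()

print(expand(D1 - (a - b)*(b - c)*(c - a)))              # 0
print(expand(D2 - (a - b)*(b - c)*(c - a)*(a + b + c)))  # 0
```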
Question 9. $\begin{vmatrix} x&x^2&yz\\y&y^2&zx\\z&z^2&xy \end{vmatrix} = (x – y) (y – z) (z – x) (xy + yz + zx)$
Answer:
Solution:
Let the given determinant be $\Delta$.
$\Delta = \begin{vmatrix} x&x^2&yz\\y&y^2&zx\\z&z^2&xy \end{vmatrix}$
Apply the row transformations $R_1 \to R_1 - R_2$ and $R_2 \to R_2 - R_3$.
$\Delta = \begin{vmatrix} x-y & x^2-y^2 & yz-zx \\ y-z & y^2-z^2 & zx-xy \\ z & z^2 & xy \end{vmatrix}$
Factor out $(x-y)$ from $R_1$ and $(y-z)$ from $R_2$. Note that:
$x^2-y^2 = (x-y)(x+y)$
$yz-zx = z(y-x) = -z(x-y)$
$y^2-z^2 = (y-z)(y+z)$
$zx-xy = x(z-y) = -x(y-z)$
So, taking common factors from $R_1$ and $R_2$, we get:
$\Delta = (x-y)(y-z) \begin{vmatrix} 1 & x+y & -z \\ 1 & y+z & -x \\ z & z^2 & xy \end{vmatrix}$
Now, apply the row transformation $R_1 \to R_1 - R_2$ on the determinant.
$\Delta = (x-y)(y-z) \begin{vmatrix} 1-1 & (x+y)-(y+z) & -z-(-x) \\ 1 & y+z & -x \\ z & z^2 & xy \end{vmatrix}$
$\Delta = (x-y)(y-z) \begin{vmatrix} 0 & x-z & x-z \\ 1 & y+z & -x \\ z & z^2 & xy \end{vmatrix}$
Factor out $(x-z)$ from $R_1$.
$\Delta = (x-y)(y-z)(x-z) \begin{vmatrix} 0 & 1 & 1 \\ 1 & y+z & -x \\ z & z^2 & xy \end{vmatrix}$
Expand the remaining determinant along the first row ($R_1$).
Expanding along $R_1$, we have:
$\begin{vmatrix} 0 & 1 & 1 \\ 1 & y+z & -x \\ z & z^2 & xy \end{vmatrix} = 0 \cdot \begin{vmatrix} y+z & -x \\ z^2 & xy \end{vmatrix} - 1 \cdot \begin{vmatrix} 1 & -x \\ z & xy \end{vmatrix} + 1 \cdot \begin{vmatrix} 1 & y+z \\ z & z^2 \end{vmatrix}$
$= 0 - (1 \cdot xy - (-x) \cdot z) + (1 \cdot z^2 - (y+z) \cdot z)$
$= -(xy + xz) + (z^2 - (yz + z^2))$
$= -xy - xz + z^2 - yz - z^2$
$= -xy - xz - yz$
$= -(xy + yz + zx)$
Substitute this back into the expression for $\Delta$:
$\Delta = (x-y)(y-z)(x-z) \cdot (-(xy + yz + zx))$
We can rewrite $(x-z)$ as $-(z-x)$. So,
$\Delta = (x-y)(y-z) \cdot (-(z-x)) \cdot (-(xy + yz + zx))$
$\Delta = (x-y)(y-z)(z-x)(xy + yz + zx)$
Thus, we have shown that $\begin{vmatrix} x&x^2&yz\\y&y^2&zx\\z&z^2&xy \end{vmatrix} = (x – y) (y – z) (z – x) (xy + yz + zx)$.
Hence Proved.
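A supplementary SymPy check of the Question 9 identity:

```python
from sympy import symbols, Matrix, expand

x, y, z = symbols('x y z')
D = Matrix([[x, x**2, y*z],
            [y, y**2, z*x],
            [z, z**2, x*y]]).det()

print(expand(D - (x - y)*(y - z)*(z - x)*(x*y + y*z + z*x)))  # 0
```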
Question 10.
(i) $\begin{vmatrix} x+4&2x&2x\\2x&x+4&2x\\2x&2x&x+4 \end{vmatrix} = (5x + 4) (4 - x)^2$
(ii) $\begin{vmatrix} y+k&y&y\\y&y+k&y\\y&y&y+k \end{vmatrix} = k^2 (3y + k)$
Answer:
Solution (i):
Let $\Delta_1 = \begin{vmatrix} x+4&2x&2x\\2x&x+4&2x\\2x&2x&x+4 \end{vmatrix}$.
We need to prove that $\Delta_1 = (5x + 4) (4 - x)^2$.
Apply the operation $C_1 \to C_1 + C_2 + C_3$ to the determinant:
$\Delta_1 = \begin{vmatrix} x+4+2x+2x&2x&2x\\2x+x+4+2x&x+4&2x\\2x+2x+x+4&2x&x+4 \end{vmatrix}$
$\Delta_1 = \begin{vmatrix} 5x+4&2x&2x\\5x+4&x+4&2x\\5x+4&2x&x+4 \end{vmatrix}$
Take out the common factor $(5x+4)$ from $C_1$:
$\Delta_1 = (5x+4) \begin{vmatrix} 1&2x&2x\\1&x+4&2x\\1&2x&x+4 \end{vmatrix}$
Apply row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$ to create zeros in the first column:
$\Delta_1 = (5x+4) \begin{vmatrix} 1 & 2x & 2x \\ 1-1 & (x+4)-2x & 2x-2x \\ 1-1 & 2x-2x & (x+4)-2x \end{vmatrix}$
$\Delta_1 = (5x+4) \begin{vmatrix} 1 & 2x & 2x \\ 0 & 4-x & 0 \\ 0 & 0 & 4-x \end{vmatrix}$
Expand the determinant along the first column ($C_1$). Only the first element contributes since the others are zero:
$\Delta_1 = (5x+4) \cdot 1 \cdot \begin{vmatrix} 4-x & 0 \\ 0 & 4-x \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} 4-x & 0 \\ 0 & 4-x \end{vmatrix} = (4-x)(4-x) - 0 \cdot 0 = (4-x)^2$
Substitute this back into the expression for $\Delta_1$:
$\Delta_1 = (5x+4)(4-x)^2$
This matches the right-hand side of the equation we needed to prove.
Hence Proved (i).
Solution (ii):
Let $\Delta_2 = \begin{vmatrix} y+k&y&y\\y&y+k&y\\y&y&y+k \end{vmatrix}$.
We need to prove that $\Delta_2 = k^2 (3y + k)$.
Apply the operation $C_1 \to C_1 + C_2 + C_3$ to the determinant:
$\Delta_2 = \begin{vmatrix} y+k+y+y&y&y\\y+y+k+y&y+k&y\\y+y+y+k&y&y+k \end{vmatrix}$
$\Delta_2 = \begin{vmatrix} 3y+k&y&y\\3y+k&y+k&y\\3y+k&y&y+k \end{vmatrix}$
Take out the common factor $(3y+k)$ from $C_1$:
$\Delta_2 = (3y+k) \begin{vmatrix} 1&y&y\\1&y+k&y\\1&y&y+k \end{vmatrix}$
Apply row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$ to create zeros in the first column:
$\Delta_2 = (3y+k) \begin{vmatrix} 1 & y & y \\ 1-1 & (y+k)-y & y-y \\ 1-1 & y-y & (y+k)-y \end{vmatrix}$
$\Delta_2 = (3y+k) \begin{vmatrix} 1 & y & y \\ 0 & k & 0 \\ 0 & 0 & k \end{vmatrix}$
Expand the determinant along the first column ($C_1$). Only the first element contributes since the others are zero:
$\Delta_2 = (3y+k) \cdot 1 \cdot \begin{vmatrix} k & 0 \\ 0 & k \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} k & 0 \\ 0 & k \end{vmatrix} = k \cdot k - 0 \cdot 0 = k^2$
Substitute this back into the expression for $\Delta_2$:
$\Delta_2 = (3y+k) k^2 = k^2 (3y+k)$
This matches the right-hand side of the equation we needed to prove.
Hence Proved (ii).
Question 11.
(i) $\begin{vmatrix} a−b−c&2a&2a\\2b&b−c−a&2b\\2c&2c&c−a−b \end{vmatrix} = (a + b + c)^3$
(ii) $\begin{vmatrix} x+y+2z&x&y\\z&y+z+2x&y\\z&x&z+x+2y \end{vmatrix} = 2(x + y + z)^3$
Answer:
Solution (i):
Let $\Delta_1 = \begin{vmatrix} a−b−c&2a&2a\\2b&b−c−a&2b\\2c&2c&c−a−b \end{vmatrix}$.
We need to prove that $\Delta_1 = (a + b + c)^3$.
Apply the row operation $R_1 \to R_1 + R_2 + R_3$:
$\Delta_1 = \begin{vmatrix} (a-b-c)+2b+2c & 2a+(b-c-a)+2c & 2a+2b+(c-a-b) \\ 2b & b-c-a & 2b \\ 2c & 2c & c-a-b \end{vmatrix}$
$\Delta_1 = \begin{vmatrix} a+b+c & a+b+c & a+b+c \\ 2b & b-c-a & 2b \\ 2c & 2c & c-a-b \end{vmatrix}$
Take out the common factor $(a+b+c)$ from $R_1$:
$\Delta_1 = (a+b+c) \begin{vmatrix} 1&1&1\\2b&b−c−a&2b\\2c&2c&c−a−b \end{vmatrix}$
Apply column operations $C_2 \to C_2 - C_1$ and $C_3 \to C_3 - C_1$:
$\Delta_1 = (a+b+c) \begin{vmatrix} 1 & 1-1 & 1-1 \\ 2b & (b-c-a)-2b & 2b-2b \\ 2c & 2c-2c & (c-a-b)-2c \end{vmatrix}$
$\Delta_1 = (a+b+c) \begin{vmatrix} 1 & 0 & 0 \\ 2b & -b-c-a & 0 \\ 2c & 0 & -c-a-b \end{vmatrix}$
$\Delta_1 = (a+b+c) \begin{vmatrix} 1 & 0 & 0 \\ 2b & -(a+b+c) & 0 \\ 2c & 0 & -(a+b+c) \end{vmatrix}$
Expand the determinant along the first row ($R_1$). Only the first element is non-zero:
$\Delta_1 = (a+b+c) \cdot 1 \cdot \begin{vmatrix} -(a+b+c) & 0 \\ 0 & -(a+b+c) \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} -(a+b+c) & 0 \\ 0 & -(a+b+c) \end{vmatrix} = (-(a+b+c)) \cdot (-(a+b+c)) - 0 \cdot 0 = (a+b+c)^2$
Substitute this back:
$\Delta_1 = (a+b+c) \cdot (a+b+c)^2 = (a+b+c)^3$
This matches the right-hand side of the equation.
Hence Proved (i).
Solution (ii):
Let $\Delta_2 = \begin{vmatrix} x+y+2z&x&y\\z&y+z+2x&y\\z&x&z+x+2y \end{vmatrix}$.
We need to prove that $\Delta_2 = 2(x + y + z)^3$.
Apply the column operation $C_1 \to C_1 + C_2 + C_3$:
$\Delta_2 = \begin{vmatrix} (x+y+2z)+x+y & x & y \\ z+(y+z+2x)+y & y+z+2x & y \\ z+x+(z+x+2y) & x & z+x+2y \end{vmatrix}$
$\Delta_2 = \begin{vmatrix} 2x+2y+2z & x & y \\ 2x+2y+2z & y+z+2x & y \\ 2x+2y+2z & x & z+x+2y \end{vmatrix}$
$\Delta_2 = \begin{vmatrix} 2(x+y+z) & x & y \\ 2(x+y+z) & y+z+2x & y \\ 2(x+y+z) & x & z+x+2y \end{vmatrix}$
Take out the common factor $2(x+y+z)$ from $C_1$:
$\Delta_2 = 2(x+y+z) \begin{vmatrix} 1&x&y\\1&y+z+2x&y\\1&x&z+x+2y \end{vmatrix}$
Apply row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$:
$\Delta_2 = 2(x+y+z) \begin{vmatrix} 1 & x & y \\ 1-1 & (y+z+2x)-x & y-y \\ 1-1 & x-x & (z+x+2y)-y \end{vmatrix}$
$\Delta_2 = 2(x+y+z) \begin{vmatrix} 1 & x & y \\ 0 & y+z+x & 0 \\ 0 & 0 & z+x+y \end{vmatrix}$
$\Delta_2 = 2(x+y+z) \begin{vmatrix} 1 & x & y \\ 0 & x+y+z & 0 \\ 0 & 0 & x+y+z \end{vmatrix}$
Expand the determinant along the first column ($C_1$). Only the first element is non-zero:
$\Delta_2 = 2(x+y+z) \cdot 1 \cdot \begin{vmatrix} x+y+z & 0 \\ 0 & x+y+z \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} x+y+z & 0 \\ 0 & x+y+z \end{vmatrix} = (x+y+z) \cdot (x+y+z) - 0 \cdot 0 = (x+y+z)^2$
Substitute this back:
$\Delta_2 = 2(x+y+z) \cdot (x+y+z)^2 = 2(x+y+z)^3$
This matches the right-hand side of the equation.
Hence Proved (ii).
Question 12. $\begin{vmatrix} 1&x&x^2\\x^2&1&x\\x&x^2&1 \end{vmatrix} = (1 - x^3)^2$
Answer:
Solution:
Let $\Delta = \begin{vmatrix} 1&x&x^2\\x^2&1&x\\x&x^2&1 \end{vmatrix}$.
We need to prove that $\Delta = (1 - x^3)^2$.
Apply the column operation $C_1 \to C_1 + C_2 + C_3$:
$\Delta = \begin{vmatrix} 1+x+x^2 & x & x^2 \\ x^2+1+x & 1 & x \\ x+x^2+1 & x^2 & 1 \end{vmatrix}$
$\Delta = \begin{vmatrix} 1+x+x^2 & x & x^2 \\ 1+x+x^2 & 1 & x \\ 1+x+x^2 & x^2 & 1 \end{vmatrix}$
Take out the common factor $(1+x+x^2)$ from $C_1$:
$\Delta = (1+x+x^2) \begin{vmatrix} 1&x&x^2\\1&1&x\\1&x^2&1 \end{vmatrix}$
Apply row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$ to create zeros in the first column:
$\Delta = (1+x+x^2) \begin{vmatrix} 1 & x & x^2 \\ 1-1 & 1-x & x-x^2 \\ 1-1 & x^2-x & 1-x^2 \end{vmatrix}$
$\Delta = (1+x+x^2) \begin{vmatrix} 1 & x & x^2 \\ 0 & 1-x & x(1-x) \\ 0 & x(x-1) & (1-x)(1+x) \end{vmatrix}$
$\Delta = (1+x+x^2) \begin{vmatrix} 1 & x & x^2 \\ 0 & 1-x & x(1-x) \\ 0 & -x(1-x) & (1-x)(1+x) \end{vmatrix}$
Expand the determinant along the first column ($C_1$). Only the element in the first row contributes:
$\Delta = (1+x+x^2) \cdot 1 \cdot \begin{vmatrix} 1-x & x(1-x) \\ -x(1-x) & (1-x)(1+x) \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} 1-x & x(1-x) \\ -x(1-x) & (1-x)(1+x) \end{vmatrix} = (1-x) \cdot (1-x)(1+x) - (x(1-x)) \cdot (-x(1-x))$
$= (1-x)^2(1+x) + x^2(1-x)^2$
$= (1-x)^2 [(1+x) + x^2]$
$= (1-x)^2 (1+x+x^2)$
Substitute this result back into the expression for $\Delta$:
$\Delta = (1+x+x^2) \cdot (1-x)^2 (1+x+x^2)$
$\Delta = (1+x+x^2)^2 (1-x)^2$
$\Delta = [(1+x+x^2)(1-x)]^2$
Recall the factorization for the difference of cubes: $1 - x^3 = (1-x)(1+x+x^2)$.
Using this identity:
$\Delta = [1-x^3]^2$
$\Delta = (1-x^3)^2$
This matches the right-hand side of the equation.
Hence Proved.
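An optional SymPy check of the Question 12 identity:

```python
from sympy import symbols, Matrix, expand

x = symbols('x')
D = Matrix([[1, x, x**2],
            [x**2, 1, x],
            [x, x**2, 1]]).det()

print(expand(D - (1 - x**3)**2))  # 0
```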
Question 13. $\begin{vmatrix} 1+a^2−b^2&2ab&−2b\\2ab&1−a^2+b^2&2a\\2b&−2a&1−a^2−b^2 \end{vmatrix} = (1 + a^2 +b^2)^3$
Answer:
Solution:
Let the given determinant be $D$.
$D = \begin{vmatrix} 1+a^2−b^2&2ab&−2b\\2ab&1−a^2+b^2&2a\\2b&−2a&1−a^2−b^2 \end{vmatrix}$
Apply row operations $R_1 \to R_1 + b R_3$ and $R_2 \to R_2 - a R_3$.
$D = \begin{vmatrix} (1+a^2-b^2) + b(2b) & 2ab + b(-2a) & -2b + b(1-a^2-b^2) \\ 2ab - a(2b) & (1-a^2+b^2) - a(-2a) & 2a - a(1-a^2-b^2) \\ 2b & -2a & 1-a^2-b^2 \end{vmatrix}$
Simplifying the elements:
Row 1:
$(1+a^2-b^2) + 2b^2 = 1+a^2+b^2$
$2ab - 2ab = 0$
$-2b + b - a^2b - b^3 = -b - a^2b - b^3 = -b(1+a^2+b^2)$
Row 2:
$2ab - 2ab = 0$
$(1-a^2+b^2) + 2a^2 = 1+a^2+b^2$
$2a - a + a^3 + ab^2 = a + a^3 + ab^2 = a(1+a^2+b^2)$
The determinant becomes:
$D = \begin{vmatrix} 1+a^2+b^2 & 0 & -b(1+a^2+b^2) \\ 0 & 1+a^2+b^2 & a(1+a^2+b^2) \\ 2b & -2a & 1-a^2-b^2 \end{vmatrix}$
Take $(1+a^2+b^2)$ common from $R_1$ and $R_2$.
$D = (1+a^2+b^2)(1+a^2+b^2) \begin{vmatrix} 1 & 0 & -b \\ 0 & 1 & a \\ 2b & -2a & 1-a^2-b^2 \end{vmatrix}$
$D = (1+a^2+b^2)^2 \begin{vmatrix} 1 & 0 & -b \\ 0 & 1 & a \\ 2b & -2a & 1-a^2-b^2 \end{vmatrix}$
Expand the determinant along the first row ($R_1$).
$D = (1+a^2+b^2)^2 \left[ 1 \cdot \begin{vmatrix} 1 & a \\ -2a & 1-a^2-b^2 \end{vmatrix} - 0 \cdot \begin{vmatrix} 0 & a \\ 2b & 1-a^2-b^2 \end{vmatrix} + (-b) \cdot \begin{vmatrix} 0 & 1 \\ 2b & -2a \end{vmatrix} \right]$
Evaluate the 2x2 determinants:
$\begin{vmatrix} 1 & a \\ -2a & 1-a^2-b^2 \end{vmatrix} = 1(1-a^2-b^2) - a(-2a) = 1-a^2-b^2 + 2a^2 = 1+a^2-b^2$
$\begin{vmatrix} 0 & 1 \\ 2b & -2a \end{vmatrix} = 0(-2a) - 1(2b) = 0 - 2b = -2b$
Substitute these values back into the expansion:
$D = (1+a^2+b^2)^2 \left[ 1 \cdot (1+a^2-b^2) - b \cdot (-2b) \right]$
$D = (1+a^2+b^2)^2 \left[ 1+a^2-b^2 + 2b^2 \right]$
$D = (1+a^2+b^2)^2 \left[ 1+a^2+b^2 \right]$
$D = (1+a^2+b^2)^{2+1}$
$D = (1+a^2+b^2)^3$
Thus, $\begin{vmatrix} 1+a^2−b^2&2ab&−2b\\2ab&1−a^2+b^2&2a\\2b&−2a&1−a^2−b^2 \end{vmatrix} = (1 + a^2 +b^2)^3$.
Hence, Proved.
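An optional SymPy check of the Question 13 identity:

```python
from sympy import symbols, Matrix, expand

a, b = symbols('a b')
D = Matrix([[1 + a**2 - b**2, 2*a*b, -2*b],
            [2*a*b, 1 - a**2 + b**2, 2*a],
            [2*b, -2*a, 1 - a**2 - b**2]]).det()

print(expand(D - (1 + a**2 + b**2)**3))  # 0
```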
Question 14. $\begin{vmatrix} a^2+1&ab&ac\\ab&b^2+1&bc\\ca&cb&c^2+1 \end{vmatrix} = 1 + a^2 + b^2 + c^2 $
Answer:
Given:
The determinant $D = \begin{vmatrix} a^2+1&ab&ac\\ab&b^2+1&bc\\ca&cb&c^2+1 \end{vmatrix}$
To Prove:
$D = 1 + a^2 + b^2 + c^2$
Solution:
Let the given determinant be $D$.
$D = \begin{vmatrix} a^2+1&ab&ac\\ab&b^2+1&bc\\ca&cb&c^2+1 \end{vmatrix}$
Multiply $C_1$ by $a$, $C_2$ by $b$, and $C_3$ by $c$, and compensate by dividing the determinant by $abc$ (this assumes $a, b, c \neq 0$; if any of them is zero, the identity can be verified directly by expanding the original determinant).
$D = \frac{1}{abc} \begin{vmatrix} a(a^2+1)&b(ab)&c(ac)\\a(ab)&b(b^2+1)&c(bc)\\a(ca)&b(cb)&c(c^2+1) \end{vmatrix}$
$D = \frac{1}{abc} \begin{vmatrix} a^3+a&ab^2&ac^2\\a^2b&b^3+b&bc^2\\a^2c&b^2c&c^3+c \end{vmatrix}$
Now, take $a$ common from $R_1$, $b$ common from $R_2$, and $c$ common from $R_3$.
$D = \frac{abc}{abc} \begin{vmatrix} a^2+1&b^2&c^2\\a^2&b^2+1&c^2\\a^2&b^2&c^2+1 \end{vmatrix}$
$D = \begin{vmatrix} a^2+1&b^2&c^2\\a^2&b^2+1&c^2\\a^2&b^2&c^2+1 \end{vmatrix}$
Apply row operations $R_1 \to R_1 - R_2$ and $R_2 \to R_2 - R_3$.
$D = \begin{vmatrix} (a^2+1) - a^2 & b^2 - (b^2+1) & c^2 - c^2 \\ a^2 - a^2 & (b^2+1) - b^2 & c^2 - (c^2+1) \\ a^2 & b^2 & c^2+1 \end{vmatrix}$
Simplify the elements:
$D = \begin{vmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ a^2 & b^2 & c^2+1 \end{vmatrix}$
Expand the determinant along the first row ($R_1$).
$D = 1 \cdot \begin{vmatrix} 1 & -1 \\ b^2 & c^2+1 \end{vmatrix} - (-1) \cdot \begin{vmatrix} 0 & -1 \\ a^2 & c^2+1 \end{vmatrix} + 0 \cdot \begin{vmatrix} 0 & 1 \\ a^2 & b^2 \end{vmatrix}$
Evaluate the 2x2 determinants:
$\begin{vmatrix} 1 & -1 \\ b^2 & c^2+1 \end{vmatrix} = 1(c^2+1) - (-1)(b^2) = c^2+1+b^2$
$\begin{vmatrix} 0 & -1 \\ a^2 & c^2+1 \end{vmatrix} = 0(c^2+1) - (-1)(a^2) = 0+a^2 = a^2$
$\begin{vmatrix} 0 & 1 \\ a^2 & b^2 \end{vmatrix} = 0(b^2) - 1(a^2) = 0-a^2 = -a^2$ (Note: This term is multiplied by 0 in the expansion, so its value doesn't affect the final result).
Substitute these values back into the expansion:
$D = 1 \cdot (c^2+1+b^2) + 1 \cdot (a^2) + 0 \cdot (-a^2)$
$D = c^2+1+b^2 + a^2 + 0$
$D = 1 + a^2 + b^2 + c^2$
This is the required expression on the right-hand side.
Hence, Proved.
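A supplementary SymPy check of Question 14 (the symbolic computation needs no restriction on $a$, $b$, $c$):

```python
from sympy import symbols, Matrix, expand

a, b, c = symbols('a b c')
D = Matrix([[a**2 + 1, a*b, a*c],
            [a*b, b**2 + 1, b*c],
            [c*a, c*b, c**2 + 1]]).det()

print(expand(D - (1 + a**2 + b**2 + c**2)))  # 0
```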
Choose the correct answer in Exercises 15 and 16.
Question 15. Let A be a square matrix of order 3 × 3, then $|kA|$ is equal to
(A) $k|A|$
(B) $k^2|A|$
(C) $k^3|A|$
(D) $3k|A|$
Answer:
Solution:
Let $A$ be a square matrix of order $n \times n$. The property of determinants states that for any scalar $k$, $|kA| = k^n |A|$.
In this question, $A$ is a square matrix of order $3 \times 3$. So, $n = 3$.
Using the property, we have:
$|kA| = k^3 |A|$
Therefore, the correct answer is $k^3 |A|$.
Comparing with the given options:
(A) $k| A |$
(B) $k^2| A |$
(C) $k^3| A |$
(D) $3k | A |$
The correct option is (C).
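As an illustration of the property $|kA| = k^n|A|$, here is a small SymPy sketch; the matrix and the scalar are arbitrary choices made for this example, not taken from the textbook:

```python
from sympy import Matrix

# An arbitrary 3x3 matrix and scalar, chosen only for illustration
A = Matrix([[1, 2, 3],
            [0, 4, 5],
            [1, 0, 6]])
k = 2

print((k * A).det())   # 176
print(k**3 * A.det())  # 176, matching |kA| = k^3 |A|
```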
Question 16. Which of the following is correct
(A) Determinant is a square matrix.
(B) Determinant is a number associated to a matrix.
(C) Determinant is a number associated to a square matrix.
(D) None of these
Answer:
Solution:
Let's examine each option:
(A) Determinant is a square matrix.
This statement is incorrect. A determinant is a scalar value (a number), not a matrix.
(B) Determinant is a number associated to a matrix.
This statement is partially correct, but not precise. Determinants are only defined for square matrices.
(C) Determinant is a number associated to a square matrix.
This statement is correct. The determinant is a unique scalar value calculated from the elements of a square matrix.
(D) None of these
This statement is incorrect because option (C) is correct.
Therefore, the correct statement is that a determinant is a number associated with a square matrix.
The correct option is (C) Determinant is a number associated to a square matrix.
Example 17 & 18 (Before Exercise 4.3)
Example 17: Find the area of the triangle whose vertices are (3, 8), (– 4, 2) and (5, 1).
Answer:
Given:
The vertices of the triangle are $(x_1, y_1) = (3, 8)$, $(x_2, y_2) = (-4, 2)$, and $(x_3, y_3) = (5, 1)$.
To Find:
Area of the triangle.
Solution:
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is given by the formula:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$
Substitute the coordinates of the given vertices into the determinant:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} 3 & 8 & 1 \\ -4 & 2 & 1 \\ 5 & 1 & 1 \end{vmatrix} \right|$
Evaluate the determinant by expanding along the first row ($R_1$):
Determinant value $= 3 \begin{vmatrix} 2 & 1 \\ 1 & 1 \end{vmatrix} - 8 \begin{vmatrix} -4 & 1 \\ 5 & 1 \end{vmatrix} + 1 \begin{vmatrix} -4 & 2 \\ 5 & 1 \end{vmatrix}$
$= 3((2)(1) - (1)(1)) - 8((-4)(1) - (1)(5)) + 1((-4)(1) - (2)(5))$
$= 3(2 - 1) - 8(-4 - 5) + 1(-4 - 10)$
$= 3(1) - 8(-9) + 1(-14)$
$= 3 + 72 - 14$
$= 75 - 14$
$= 61$
Now, calculate the area using the formula:
$\text{Area} = \frac{1}{2} |61|$
$\text{Area} = \frac{61}{2}$
Since the area is always taken as a positive quantity (hence the absolute value in the formula), the area of the triangle is $\frac{61}{2}$ square units.
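A quick SymPy sketch reproducing the Example 17 computation (supplementary only):

```python
from sympy import Matrix, Rational

D = Matrix([[3, 8, 1],
            [-4, 2, 1],
            [5, 1, 1]]).det()

print(D)                        # 61
print(Rational(1, 2) * abs(D))  # 61/2
```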
Example 18: Find the equation of the line joining A(1, 3) and B (0, 0) using determinants and find k if D(k, 0) is a point such that area of triangle ABD is 3 sq units.
Answer:
Given:
Points A(1, 3) and B(0, 0).
Point D(k, 0).
Area of triangle ABD = 3 sq units.
To Find:
The equation of the line joining A and B using determinants.
The value(s) of k such that the area of triangle ABD is 3 sq units.
Solution:
Part 1: Equation of the line joining A(1, 3) and B(0, 0)
Let P(x, y) be any point on the line joining points A(1, 3) and B(0, 0).
For the three points A, B, and P to be collinear, the area of the triangle formed by these points must be zero.
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is $\frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$.
Here, $(x_1, y_1) = (1, 3)$, $(x_2, y_2) = (0, 0)$, and $(x_3, y_3) = (x, y)$.
Setting the area to zero for collinear points:
$\frac{1}{2} \left| \begin{vmatrix} 1 & 3 & 1 \\ 0 & 0 & 1 \\ x & y & 1 \end{vmatrix} \right| = 0$
This implies the determinant value must be zero:
$\begin{vmatrix} 1 & 3 & 1 \\ 0 & 0 & 1 \\ x & y & 1 \end{vmatrix} = 0$
Expand the determinant along the second row ($R_2$), as it contains two zeros:
$-0 \cdot \begin{vmatrix} 3 & 1 \\ y & 1 \end{vmatrix} + 0 \cdot \begin{vmatrix} 1 & 1 \\ x & 1 \end{vmatrix} - 1 \cdot \begin{vmatrix} 1 & 3 \\ x & y \end{vmatrix} = 0$ (the cofactor signs along the second row are $-,\ +,\ -$)
$0 + 0 - (1 \cdot y - 3 \cdot x) = 0$
$3x - y = 0$
or
$y = 3x$
Thus, the equation of the line joining A(1, 3) and B(0, 0) is $y = 3x$ or $3x - y = 0$.
Part 2: Find k if Area of triangle ABD = 3 sq units
The vertices of triangle ABD are A(1, 3), B(0, 0), and D(k, 0).
The area of triangle ABD is given as 3 sq units.
Using the determinant formula for the area:
$\text{Area of } \triangle ABD = \frac{1}{2} \left| \begin{vmatrix} x_A & y_A & 1 \\ x_B & y_B & 1 \\ x_D & y_D & 1 \end{vmatrix} \right|$
Substitute the coordinates of A(1, 3), B(0, 0), and D(k, 0):
$3 = \frac{1}{2} \left| \begin{vmatrix} 1 & 3 & 1 \\ 0 & 0 & 1 \\ k & 0 & 1 \end{vmatrix} \right|$
Multiply by 2 on both sides:
$6 = \left| \begin{vmatrix} 1 & 3 & 1 \\ 0 & 0 & 1 \\ k & 0 & 1 \end{vmatrix} \right|$
Evaluate the determinant. Expanding along the second row ($R_2$) is convenient:
Determinant value $= -0 \cdot \begin{vmatrix} 3 & 1 \\ 0 & 1 \end{vmatrix} + 0 \cdot \begin{vmatrix} 1 & 1 \\ k & 1 \end{vmatrix} - 1 \cdot \begin{vmatrix} 1 & 3 \\ k & 0 \end{vmatrix}$ (the cofactor signs along the second row are $-,\ +,\ -$)
$= 0 + 0 - (1 \cdot 0 - 3 \cdot k)$
$= -(-3k)$
$= 3k$
So, the equation becomes:
$6 = |3k|$
The absolute value means we have two possibilities:
Case 1: $3k = 6$
$k = \frac{6}{3} = 2$
Case 2: $3k = -6$
$k = \frac{-6}{3} = -2$
Thus, the possible values of k are 2 and -2.
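A short SymPy sketch of Part 2 of Example 18, solving the two cases $3k = \pm 6$:

```python
from sympy import symbols, Matrix, solve

k = symbols('k')
D = Matrix([[1, 3, 1],
            [0, 0, 1],
            [k, 0, 1]]).det()

print(D)                                 # 3*k
print(solve(D - 6, k), solve(D + 6, k))  # [2] [-2]
```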
Exercise 4.3
Question 1. Find area of the triangle with vertices at the point given in each of the following :
(i) (1, 0), (6, 0), (4, 3)
(ii) (2, 7), (1, 1), (10, 8)
(iii) (–2, –3), (3, 2), (–1, –8)
Answer:
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is given by the formula:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$
(i) Vertices: (1, 0), (6, 0), (4, 3)
Let $(x_1, y_1) = (1, 0)$, $(x_2, y_2) = (6, 0)$, and $(x_3, y_3) = (4, 3)$.
Area $= \frac{1}{2} \left| \begin{vmatrix} 1 & 0 & 1 \\ 6 & 0 & 1 \\ 4 & 3 & 1 \end{vmatrix} \right|$
Expand the determinant along the second column ($C_2$):
Determinant value $= 0 \cdot A_{12} + 0 \cdot A_{22} + 3 \cdot A_{32}$, where $A_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 1 \\ 6 & 1 \end{vmatrix}$ (the cofactor sign at position (3, 2) is negative)
$= -3 (1 \cdot 1 - 1 \cdot 6)$
$= -3 (1 - 6)$
$= -3 (-5)$
$= 15$
Area $= \frac{1}{2} |15|$
Area $= \frac{1}{2} \cdot 15$
Area $= \frac{15}{2}$ square units.
(ii) Vertices: (2, 7), (1, 1), (10, 8)
Let $(x_1, y_1) = (2, 7)$, $(x_2, y_2) = (1, 1)$, and $(x_3, y_3) = (10, 8)$.
Area $= \frac{1}{2} \left| \begin{vmatrix} 2 & 7 & 1 \\ 1 & 1 & 1 \\ 10 & 8 & 1 \end{vmatrix} \right|$
Expand the determinant along the first row ($R_1$):
Determinant value $= 2 \begin{vmatrix} 1 & 1 \\ 8 & 1 \end{vmatrix} - 7 \begin{vmatrix} 1 & 1 \\ 10 & 1 \end{vmatrix} + 1 \begin{vmatrix} 1 & 1 \\ 10 & 8 \end{vmatrix}$
$= 2 (1 \cdot 1 - 1 \cdot 8) - 7 (1 \cdot 1 - 1 \cdot 10) + 1 (1 \cdot 8 - 1 \cdot 10)$
$= 2 (1 - 8) - 7 (1 - 10) + 1 (8 - 10)$
$= 2 (-7) - 7 (-9) + 1 (-2)$
$= -14 + 63 - 2$
$= 49 - 2$
$= 47$
Area $= \frac{1}{2} |47|$
Area $= \frac{1}{2} \cdot 47$
Area $= \frac{47}{2}$ square units.
(iii) Vertices: (–2, –3), (3, 2), (–1, –8)
Let $(x_1, y_1) = (-2, -3)$, $(x_2, y_2) = (3, 2)$, and $(x_3, y_3) = (-1, -8)$.
Area $= \frac{1}{2} \left| \begin{vmatrix} -2 & -3 & 1 \\ 3 & 2 & 1 \\ -1 & -8 & 1 \end{vmatrix} \right|$
Expand the determinant along the first row ($R_1$):
Determinant value $= -2 \begin{vmatrix} 2 & 1 \\ -8 & 1 \end{vmatrix} - (-3) \begin{vmatrix} 3 & 1 \\ -1 & 1 \end{vmatrix} + 1 \begin{vmatrix} 3 & 2 \\ -1 & -8 \end{vmatrix}$
$= -2 (2 \cdot 1 - 1 \cdot (-8)) + 3 (3 \cdot 1 - 1 \cdot (-1)) + 1 (3 \cdot (-8) - 2 \cdot (-1))$
$= -2 (2 + 8) + 3 (3 + 1) + 1 (-24 + 2)$
$= -2 (10) + 3 (4) + 1 (-22)$
$= -20 + 12 - 22$
$= -8 - 22$
$= -30$
Area $= \frac{1}{2} |-30|$
Area $= \frac{1}{2} \cdot 30$
Area $= 15$ square units.
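The three areas can be re-computed with a small SymPy helper; the function name `tri_area` is introduced here purely for illustration:

```python
from sympy import Matrix, Rational

def tri_area(p1, p2, p3):
    # Area = (1/2)|det|, where each row of the matrix is (x, y, 1)
    D = Matrix([list(p1) + [1], list(p2) + [1], list(p3) + [1]]).det()
    return Rational(1, 2) * abs(D)

print(tri_area((1, 0), (6, 0), (4, 3)))      # 15/2
print(tri_area((2, 7), (1, 1), (10, 8)))     # 47/2
print(tri_area((-2, -3), (3, 2), (-1, -8)))  # 15
```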
Question 2. Show that points
A (a, b + c), B (b, c + a), C (c, a + b) are collinear.
Answer:
Given:
The points are A (a, b + c), B (b, c + a), and C (c, a + b).
To Prove:
The points A, B, and C are collinear.
Solution:
Points A, B, and C are collinear if and only if the area of the triangle formed by these points is zero.
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is given by the formula:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$
Here, $(x_1, y_1) = (a, b+c)$, $(x_2, y_2) = (b, c+a)$, and $(x_3, y_3) = (c, a+b)$.
The area of the triangle formed by points A, B, and C is:
Area $= \frac{1}{2} \left| \begin{vmatrix} a & b+c & 1 \\ b & c+a & 1 \\ c & a+b & 1 \end{vmatrix} \right|$
Let's evaluate the determinant:
$D = \begin{vmatrix} a & b+c & 1 \\ b & c+a & 1 \\ c & a+b & 1 \end{vmatrix}$
Apply the column operation $C_1 \to C_1 + C_2$.
$D = \begin{vmatrix} a + (b+c) & b+c & 1 \\ b + (c+a) & c+a & 1 \\ c + (a+b) & a+b & 1 \end{vmatrix}$
$D = \begin{vmatrix} a+b+c & b+c & 1 \\ a+b+c & c+a & 1 \\ a+b+c & a+b & 1 \end{vmatrix}$
Take the common factor $(a+b+c)$ from the first column ($C_1$).
$D = (a+b+c) \begin{vmatrix} 1 & b+c & 1 \\ 1 & c+a & 1 \\ 1 & a+b & 1 \end{vmatrix}$
In the resulting determinant, the first column ($C_1$) and the third column ($C_3$) are identical.
$\begin{vmatrix} 1 & b+c & 1 \\ 1 & c+a & 1 \\ 1 & a+b & 1 \end{vmatrix}$
When two columns (or rows) of a determinant are identical, the value of the determinant is zero.
So, $D = (a+b+c) \cdot 0$
$D = 0$
Now, calculate the area of the triangle:
Area $= \frac{1}{2} |D|$
Area $= \frac{1}{2} |0|$
Area $= 0$
Since the area of the triangle formed by points A, B, and C is 0, the points A, B, and C are collinear.
Hence, Proved.
Question 3. Find values of k if area of triangle is 4 sq. units and vertices are
(i) (k, 0), (4, 0), (0, 2)
(ii) (–2, 0), (0, 4), (0, k)
Answer:
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is given by the formula:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$
Given that the area of the triangle is 4 sq. units, the determinant value must satisfy $\frac{1}{2} |\text{Determinant}| = 4$, which means $|\text{Determinant}| = 8$. Therefore, the determinant value can be either 8 or -8.
(i) Vertices: (k, 0), (4, 0), (0, 2)
Let $(x_1, y_1) = (k, 0)$, $(x_2, y_2) = (4, 0)$, and $(x_3, y_3) = (0, 2)$.
The determinant is:
$D = \begin{vmatrix} k & 0 & 1 \\ 4 & 0 & 1 \\ 0 & 2 & 1 \end{vmatrix}$
Expand the determinant along the second column ($C_2$):
$D = 0 \cdot A_{12} + 0 \cdot A_{22} + 2 \cdot A_{32}$, where $A_{32} = (-1)^{3+2} \begin{vmatrix} k & 1 \\ 4 & 1 \end{vmatrix}$ (the cofactor sign at position (3, 2) is negative)
$D = -2 (k \cdot 1 - 1 \cdot 4)$
$D = -2 (k - 4)$
Given Area = 4, so $\frac{1}{2} |D| = 4$, which means $|D| = 8$.
$|-2(k-4)| = 8$
$|-2| \, |k-4| = 8$
$2 |k-4| = 8$
$|k-4| = 4$
This gives two possible cases:
Case 1: $k-4 = 4$
$k = 4 + 4 = 8$
Case 2: $k-4 = -4$
$k = 4 - 4 = 0$
The values of k are 0 and 8.
(ii) Vertices: (–2, 0), (0, 4), (0, k)
Let $(x_1, y_1) = (-2, 0)$, $(x_2, y_2) = (0, 4)$, and $(x_3, y_3) = (0, k)$.
The determinant is:
$D = \begin{vmatrix} -2 & 0 & 1 \\ 0 & 4 & 1 \\ 0 & k & 1 \end{vmatrix}$
Expand the determinant along the first column ($C_1$):
$D = -2 \cdot \begin{vmatrix} 4 & 1 \\ k & 1 \end{vmatrix} - 0 \cdot (\text{cofactor}) + 0 \cdot (\text{cofactor})$
$D = -2 (4 \cdot 1 - 1 \cdot k)$
$D = -2 (4 - k)$
$D = -8 + 2k$
Given Area = 4, so $\frac{1}{2} |D| = 4$, which means $|D| = 8$.
$|-8 + 2k| = 8$
$|2k - 8| = 8$
$|2(k - 4)| = 8$
$|2| |k - 4| = 8$
$2 |k - 4| = 8$
$|k - 4| = 4$
This gives two possible cases:
Case 1: $k-4 = 4$
$k = 4 + 4 = 8$
Case 2: $k-4 = -4$
$k = 4 - 4 = 0$
The values of k are 0 and 8.
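A compact SymPy sketch for both parts of Question 3, splitting $|D| = 8$ into the two linear cases:

```python
from sympy import symbols, Matrix, solve

k = symbols('k')

D1 = Matrix([[k, 0, 1], [4, 0, 1], [0, 2, 1]]).det()   # part (i)
D2 = Matrix([[-2, 0, 1], [0, 4, 1], [0, k, 1]]).det()  # part (ii)

# |D| = 8 splits into the two cases D = 8 and D = -8
print(solve(D1 - 8, k) + solve(D1 + 8, k))  # [0, 8]
print(solve(D2 - 8, k) + solve(D2 + 8, k))  # [8, 0]
```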
Question 4.
(i) Find equation of line joining (1, 2) and (3, 6) using determinants.
(ii) Find equation of line joining (3, 1) and (9, 3) using determinants.
Answer:
To find the equation of the line joining two points $(x_1, y_1)$ and $(x_2, y_2)$ using determinants, we consider a general point $(x, y)$ on the line. For the three points $(x_1, y_1)$, $(x_2, y_2)$, and $(x, y)$ to be collinear, the area of the triangle formed by them must be zero.
The area of the triangle is given by:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x & y & 1 \end{vmatrix} \right|$
For collinear points, Area = 0, so the determinant must be zero:
$\begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x & y & 1 \end{vmatrix} = 0$
(i) Points: (1, 2) and (3, 6)
Let $(x_1, y_1) = (1, 2)$ and $(x_2, y_2) = (3, 6)$. The general point is $(x, y)$.
The determinant for collinearity is:
$\begin{vmatrix} 1 & 2 & 1 \\ 3 & 6 & 1 \\ x & y & 1 \end{vmatrix} = 0$
Expand the determinant (e.g., along the first row):
$1 \cdot \begin{vmatrix} 6 & 1 \\ y & 1 \end{vmatrix} - 2 \cdot \begin{vmatrix} 3 & 1 \\ x & 1 \end{vmatrix} + 1 \cdot \begin{vmatrix} 3 & 6 \\ x & y \end{vmatrix} = 0$
$1(6 \cdot 1 - 1 \cdot y) - 2(3 \cdot 1 - 1 \cdot x) + 1(3 \cdot y - 6 \cdot x) = 0$
$(6 - y) - 2(3 - x) + (3y - 6x) = 0$
$6 - y - 6 + 2x + 3y - 6x = 0$
Combine like terms:
$(2x - 6x) + (-y + 3y) + (6 - 6) = 0$
$-4x + 2y = 0$
Dividing the entire equation by -2:
$2x - y = 0$
Thus, the equation of the line joining (1, 2) and (3, 6) is $2x - y = 0$ or $y = 2x$.
(ii) Points: (3, 1) and (9, 3)
Let $(x_1, y_1) = (3, 1)$ and $(x_2, y_2) = (9, 3)$. The general point is $(x, y)$.
The determinant for collinearity is:
$\begin{vmatrix} 3 & 1 & 1 \\ 9 & 3 & 1 \\ x & y & 1 \end{vmatrix} = 0$
Expand the determinant (e.g., along the first row):
$3 \cdot \begin{vmatrix} 3 & 1 \\ y & 1 \end{vmatrix} - 1 \cdot \begin{vmatrix} 9 & 1 \\ x & 1 \end{vmatrix} + 1 \cdot \begin{vmatrix} 9 & 3 \\ x & y \end{vmatrix} = 0$
$3(3 \cdot 1 - 1 \cdot y) - 1(9 \cdot 1 - 1 \cdot x) + 1(9 \cdot y - 3 \cdot x) = 0$
$3(3 - y) - (9 - x) + (9y - 3x) = 0$
$9 - 3y - 9 + x + 9y - 3x = 0$
Combine like terms:
$(x - 3x) + (-3y + 9y) + (9 - 9) = 0$
$-2x + 6y = 0$
Dividing the entire equation by -2:
$x - 3y = 0$
Thus, the equation of the line joining (3, 1) and (9, 3) is $x - 3y = 0$ or $y = \frac{1}{3}x$.
Question 5. If area of triangle is 35 sq units with vertices (2, – 6), (5, 4) and (k, 4). Then k is
(A) 12
(B) –2
(C) –12, –2
(D) 12, –2
Answer:
Solution:
The vertices of the triangle are given as $(x_1, y_1) = (2, -6)$, $(x_2, y_2) = (5, 4)$, and $(x_3, y_3) = (k, 4)$.
The area of the triangle is given as 35 sq units.
The area of a triangle with vertices $(x_1, y_1)$, $(x_2, y_2)$, and $(x_3, y_3)$ is given by the formula:
$\text{Area} = \frac{1}{2} \left| \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \right|$
Substitute the given vertices and area into the formula:
$35 = \frac{1}{2} \left| \begin{vmatrix} 2 & -6 & 1 \\ 5 & 4 & 1 \\ k & 4 & 1 \end{vmatrix} \right|$
Multiply both sides by 2:
$70 = \left| \begin{vmatrix} 2 & -6 & 1 \\ 5 & 4 & 1 \\ k & 4 & 1 \end{vmatrix} \right|$
Evaluate the determinant. Expand along the first row ($R_1$):
Determinant value $= 2 \begin{vmatrix} 4 & 1 \\ 4 & 1 \end{vmatrix} - (-6) \begin{vmatrix} 5 & 1 \\ k & 1 \end{vmatrix} + 1 \begin{vmatrix} 5 & 4 \\ k & 4 \end{vmatrix}$
$= 2 ((4)(1) - (1)(4)) + 6 ((5)(1) - (1)(k)) + 1 ((5)(4) - (4)(k))$
$= 2 (4 - 4) + 6 (5 - k) + (20 - 4k)$
$= 2 (0) + 30 - 6k + 20 - 4k$
$= 0 + 50 - 10k$
$= 50 - 10k$
Now, we have the equation involving the absolute value of the determinant:
$|50 - 10k| = 70$
This equation leads to two possibilities:
Case 1: $50 - 10k = 70$
$-10k = 70 - 50$
$-10k = 20$
$k = \frac{20}{-10}$
$k = -2$
Case 2: $50 - 10k = -70$
$-10k = -70 - 50$
$-10k = -120$
$k = \frac{-120}{-10}$
$k = 12$
The possible values of k are 12 and -2.
Comparing with the given options:
(A) 12
(B) –2
(C) –12, –2
(D) 12, –2
The correct option is (D).
Example 19 to 22 (Before Exercise 4.4)
Example 19: Find the minor of element 6 in the determinant ∆ = $\begin{vmatrix} 1&2&3\\4&5&6\\7&8&9 \end{vmatrix}$
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} 1&2&3\\4&5&6\\7&8&9 \end{vmatrix}$.
The element is 6.
To Find:
The minor of the element 6.
Solution:
The element 6 is in the second row and the third column of the determinant.
Let $a_{ij}$ represent the element in the $i$-th row and $j$-th column. The element 6 is $a_{23}$.
The minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
For the element $a_{23}=6$, the minor is $M_{23}$.
Delete the 2nd row ($R_2$) and the 3rd column ($C_3$) from the determinant $\Delta$:
$\begin{vmatrix} 1&2&\cancel{3}\\ \cancel{4}&\cancel{5}&\cancel{6}\\ 7&8&\cancel{9} \end{vmatrix}$
The remaining submatrix is $\begin{vmatrix} 1&2\\7&8 \end{vmatrix}$.
Calculate the determinant of this submatrix to find the minor $M_{23}$:
$M_{23} = \begin{vmatrix} 1&2\\7&8 \end{vmatrix}$
$M_{23} = (1 \times 8) - (2 \times 7)$
$M_{23} = 8 - 14$
$M_{23} = -6$
The minor of the element 6 is -6.
Example 20: Find minors and cofactors of all the elements of the determinant $\begin{vmatrix} 1&−2\\4&3 \end{vmatrix}$
Answer:
Given:
The determinant $\begin{vmatrix} 1&−2\\4&3 \end{vmatrix}$.
To Find:
Minors and cofactors of all elements.
Solution:
Let the determinant be denoted by $\Delta$. The elements are $a_{11}=1$, $a_{12}=-2$, $a_{21}=4$, and $a_{22}=3$.
The Minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
The Cofactor $A_{ij}$ of an element $a_{ij}$ is given by $A_{ij} = (-1)^{i+j} M_{ij}$.
Calculating Minors:
Minor of $a_{11}=1$: $M_{11}$ is the determinant of the submatrix after deleting row 1 and column 1.
$M_{11} = \begin{vmatrix} 3 \end{vmatrix} = 3$
Minor of $a_{12}=-2$: $M_{12}$ is the determinant of the submatrix after deleting row 1 and column 2.
$M_{12} = \begin{vmatrix} 4 \end{vmatrix} = 4$
Minor of $a_{21}=4$: $M_{21}$ is the determinant of the submatrix after deleting row 2 and column 1.
$M_{21} = \begin{vmatrix} -2 \end{vmatrix} = -2$
Minor of $a_{22}=3$: $M_{22}$ is the determinant of the submatrix after deleting row 2 and column 2.
$M_{22} = \begin{vmatrix} 1 \end{vmatrix} = 1$
Calculating Cofactors:
Cofactor of $a_{11}=1$: $A_{11} = (-1)^{1+1} M_{11} = (-1)^2 \cdot 3 = 1 \cdot 3 = 3$
Cofactor of $a_{12}=-2$: $A_{12} = (-1)^{1+2} M_{12} = (-1)^3 \cdot 4 = -1 \cdot 4 = -4$
Cofactor of $a_{21}=4$: $A_{21} = (-1)^{2+1} M_{21} = (-1)^3 \cdot (-2) = -1 \cdot (-2) = 2$
Cofactor of $a_{22}=3$: $A_{22} = (-1)^{2+2} M_{22} = (-1)^4 \cdot 1 = 1 \cdot 1 = 1$
Summary of Minors and Cofactors:
Minor of element 1 ($a_{11}$): $M_{11} = 3$
Cofactor of element 1 ($a_{11}$): $A_{11} = 3$
Minor of element -2 ($a_{12}$): $M_{12} = 4$
Cofactor of element -2 ($a_{12}$): $A_{12} = -4$
Minor of element 4 ($a_{21}$): $M_{21} = -2$
Cofactor of element 4 ($a_{21}$): $A_{21} = 2$
Minor of element 3 ($a_{22}$): $M_{22} = 1$
Cofactor of element 3 ($a_{22}$): $A_{22} = 1$
Example 21: Find minors and cofactors of the elements $a_{11}$, $a_{21}$ in the determinant
∆ = $\begin{vmatrix} a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33} \end{vmatrix}$
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33} \end{vmatrix}$.
The elements are $a_{11}$ and $a_{21}$.
To Find:
Minors and cofactors of the elements $a_{11}$ and $a_{21}$.
Solution:
The Minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
The Cofactor $A_{ij}$ of an element $a_{ij}$ is given by $A_{ij} = (-1)^{i+j} M_{ij}$.
For element $a_{11}$:
$a_{11}$ is in the 1st row ($i=1$) and 1st column ($j=1$).
To find the Minor $M_{11}$, delete the 1st row and 1st column:
$\begin{vmatrix} \cancel{a_{11}}&\cancel{a_{12}}&\cancel{a_{13}}\\ \cancel{a_{21}}&a_{22}&a_{23}\\ \cancel{a_{31}}&a_{32}&a_{33} \end{vmatrix}$
The remaining submatrix is $\begin{vmatrix} a_{22}&a_{23}\\a_{32}&a_{33} \end{vmatrix}$.
The Minor $M_{11}$ is the determinant of this submatrix:
$M_{11} = \begin{vmatrix} a_{22}&a_{23}\\a_{32}&a_{33} \end{vmatrix} = a_{22} a_{33} - a_{23} a_{32}$
The Cofactor $A_{11}$ is given by $A_{11} = (-1)^{1+1} M_{11}$:
$A_{11} = (-1)^2 (a_{22} a_{33} - a_{23} a_{32})$
$A_{11} = a_{22} a_{33} - a_{23} a_{32}$
For element $a_{21}$:
$a_{21}$ is in the 2nd row ($i=2$) and 1st column ($j=1$).
To find the Minor $M_{21}$, delete the 2nd row and 1st column:
$\begin{vmatrix} \cancel{a_{11}}&a_{12}&a_{13}\\ \cancel{a_{21}}&\cancel{a_{22}}&\cancel{a_{23}}\\ \cancel{a_{31}}&a_{32}&a_{33} \end{vmatrix}$
The remaining submatrix is $\begin{vmatrix} a_{12}&a_{13}\\a_{32}&a_{33} \end{vmatrix}$.
The Minor $M_{21}$ is the determinant of this submatrix:
$M_{21} = \begin{vmatrix} a_{12}&a_{13}\\a_{32}&a_{33} \end{vmatrix} = a_{12} a_{33} - a_{13} a_{32}$
The Cofactor $A_{21}$ is given by $A_{21} = (-1)^{2+1} M_{21}$:
$A_{21} = (-1)^3 (a_{12} a_{33} - a_{13} a_{32})$
$A_{21} = - (a_{12} a_{33} - a_{13} a_{32})$
$A_{21} = a_{13} a_{32} - a_{12} a_{33}$
Summary:
Minor of $a_{11}$: $M_{11} = a_{22} a_{33} - a_{23} a_{32}$
Cofactor of $a_{11}$: $A_{11} = a_{22} a_{33} - a_{23} a_{32}$
Minor of $a_{21}$: $M_{21} = a_{12} a_{33} - a_{13} a_{32}$
Cofactor of $a_{21}$: $A_{21} = a_{13} a_{32} - a_{12} a_{33}$
Example 22: Find minors and cofactors of the elements of the determinant $\begin{vmatrix} 2&−3&5\\6&0&4\\1&5&−7 \end{vmatrix}$ and verify that $a_{11}A_{31} + a_{12}A_{32} + a_{13}A_{33} = 0$
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} 2&−3&5\\6&0&4\\1&5&−7 \end{vmatrix}$.
To Find:
Minors and cofactors of all elements and verify $a_{11} A_{31} + a_{12} A_{32} + a_{13} A_{33}= 0$.
Solution:
The elements of the determinant are:
$a_{11}=2$, $a_{12}=-3$, $a_{13}=5$
$a_{21}=6$, $a_{22}=0$, $a_{23}=4$
$a_{31}=1$, $a_{32}=5$, $a_{33}=-7$
The Minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
The Cofactor $A_{ij}$ of an element $a_{ij}$ is given by $A_{ij} = (-1)^{i+j} M_{ij}$.
Calculating Minors:
$M_{11} = \begin{vmatrix} 0&4\\5&−7 \end{vmatrix} = (0)(-7) - (4)(5) = 0 - 20 = -20$
$M_{12} = \begin{vmatrix} 6&4\\1&−7 \end{vmatrix} = (6)(-7) - (4)(1) = -42 - 4 = -46$
$M_{13} = \begin{vmatrix} 6&0\\1&5 \end{vmatrix} = (6)(5) - (0)(1) = 30 - 0 = 30$
$M_{21} = \begin{vmatrix} −3&5\\5&−7 \end{vmatrix} = (-3)(-7) - (5)(5) = 21 - 25 = -4$
$M_{22} = \begin{vmatrix} 2&5\\1&−7 \end{vmatrix} = (2)(-7) - (5)(1) = -14 - 5 = -19$
$M_{23} = \begin{vmatrix} 2&−3\\1&5 \end{vmatrix} = (2)(5) - (-3)(1) = 10 - (-3) = 13$
$M_{31} = \begin{vmatrix} −3&5\\0&4 \end{vmatrix} = (-3)(4) - (5)(0) = -12 - 0 = -12$
$M_{32} = \begin{vmatrix} 2&5\\6&4 \end{vmatrix} = (2)(4) - (5)(6) = 8 - 30 = -22$
$M_{33} = \begin{vmatrix} 2&−3\\6&0 \end{vmatrix} = (2)(0) - (-3)(6) = 0 - (-18) = 18$
Calculating Cofactors:
$A_{11} = (-1)^{1+1} M_{11} = (1)(-20) = -20$
$A_{12} = (-1)^{1+2} M_{12} = (-1)(-46) = 46$
$A_{13} = (-1)^{1+3} M_{13} = (1)(30) = 30$
$A_{21} = (-1)^{2+1} M_{21} = (-1)(-4) = 4$
$A_{22} = (-1)^{2+2} M_{22} = (1)(-19) = -19$
$A_{23} = (-1)^{2+3} M_{23} = (-1)(13) = -13$
$A_{31} = (-1)^{3+1} M_{31} = (1)(-12) = -12$
$A_{32} = (-1)^{3+2} M_{32} = (-1)(-22) = 22$
$A_{33} = (-1)^{3+3} M_{33} = (1)(18) = 18$
Verification: $a_{11} A_{31} + a_{12} A_{32} + a_{13} A_{33}= 0$
We need to calculate the sum of the products of the elements of the first row ($a_{11}, a_{12}, a_{13}$) with the corresponding cofactors of the third row ($A_{31}, A_{32}, A_{33}$).
Left Hand Side (LHS) $= a_{11} A_{31} + a_{12} A_{32} + a_{13} A_{33}$
Substitute the values:
LHS $= (2)(-12) + (-3)(22) + (5)(18)$
LHS $= -24 - 66 + 90$
LHS $= -90 + 90$
LHS $= 0$
Right Hand Side (RHS) $= 0$
Since LHS = RHS, the identity is verified.
This confirms the property that the sum of the product of elements of a row (or a column) with the cofactors of corresponding elements of another row (or column) is zero.
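The Example 22 verification can also be reproduced with SymPy's built-in `cofactor` method (a supplementary check):

```python
from sympy import Matrix

A = Matrix([[2, -3, 5],
            [6, 0, 4],
            [1, 5, -7]])

# SymPy's cofactor(i, j) uses 0-based indices, so row 3 is index 2
A31, A32, A33 = A.cofactor(2, 0), A.cofactor(2, 1), A.cofactor(2, 2)
print(A31, A32, A33)                            # -12 22 18
print(A[0, 0]*A31 + A[0, 1]*A32 + A[0, 2]*A33)  # 0
```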
Exercise 4.4
Write Minors and Cofactors of the elements of following determinants:
Question 1.
(i) $\begin{vmatrix} 2&−4\\0&3 \end{vmatrix}$
(ii) $\begin{vmatrix} a&c\\b&d \end{vmatrix}$
Answer:
The Minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
The Cofactor $A_{ij}$ of an element $a_{ij}$ is given by $A_{ij} = (-1)^{i+j} M_{ij}$.
(i) Determinant: $\begin{vmatrix} 2&−4\\0&3 \end{vmatrix}$
The elements are $a_{11}=2$, $a_{12}=-4$, $a_{21}=0$, and $a_{22}=3$.
Minors:
$M_{11} = \begin{vmatrix} 3 \end{vmatrix} = 3$
$M_{12} = \begin{vmatrix} 0 \end{vmatrix} = 0$
$M_{21} = \begin{vmatrix} -4 \end{vmatrix} = -4$
$M_{22} = \begin{vmatrix} 2 \end{vmatrix} = 2$
Cofactors:
$A_{11} = (-1)^{1+1} M_{11} = (-1)^2 \cdot 3 = 1 \cdot 3 = 3$
$A_{12} = (-1)^{1+2} M_{12} = (-1)^3 \cdot 0 = -1 \cdot 0 = 0$
$A_{21} = (-1)^{2+1} M_{21} = (-1)^3 \cdot (-4) = -1 \cdot (-4) = 4$
$A_{22} = (-1)^{2+2} M_{22} = (-1)^4 \cdot 2 = 1 \cdot 2 = 2$
(ii) Determinant: $\begin{vmatrix} a&c\\b&d \end{vmatrix}$
The elements are $a_{11}=a$, $a_{12}=c$, $a_{21}=b$, and $a_{22}=d$.
Minors:
$M_{11} = \begin{vmatrix} d \end{vmatrix} = d$
$M_{12} = \begin{vmatrix} b \end{vmatrix} = b$
$M_{21} = \begin{vmatrix} c \end{vmatrix} = c$
$M_{22} = \begin{vmatrix} a \end{vmatrix} = a$
Cofactors:
$A_{11} = (-1)^{1+1} M_{11} = (-1)^2 \cdot d = 1 \cdot d = d$
$A_{12} = (-1)^{1+2} M_{12} = (-1)^3 \cdot b = -1 \cdot b = -b$
$A_{21} = (-1)^{2+1} M_{21} = (-1)^3 \cdot c = -1 \cdot c = -c$
$A_{22} = (-1)^{2+2} M_{22} = (-1)^4 \cdot a = 1 \cdot a = a$
Question 2.
(i) $\begin{vmatrix} 1&0&0\\0&1&0\\0&0&1 \end{vmatrix}$
(ii) $\begin{vmatrix} 1&0&4\\3&5&−1\\0&1&2 \end{vmatrix}$
Answer:
The Minor $M_{ij}$ of an element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column.
The Cofactor $A_{ij}$ of an element $a_{ij}$ is given by $A_{ij} = (-1)^{i+j} M_{ij}$.
(i) Determinant: $\begin{vmatrix} 1&0&0\\0&1&0\\0&0&1 \end{vmatrix}$
The elements are $a_{11}=1$, $a_{12}=0$, $a_{13}=0$, $a_{21}=0$, $a_{22}=1$, $a_{23}=0$, $a_{31}=0$, $a_{32}=0$, $a_{33}=1$.
Minors:
$M_{11} = \begin{vmatrix} 1&0\\0&1 \end{vmatrix} = (1)(1) - (0)(0) = 1$
$M_{12} = \begin{vmatrix} 0&0\\0&1 \end{vmatrix} = (0)(1) - (0)(0) = 0$
$M_{13} = \begin{vmatrix} 0&1\\0&0 \end{vmatrix} = (0)(0) - (1)(0) = 0$
$M_{21} = \begin{vmatrix} 0&0\\0&1 \end{vmatrix} = (0)(1) - (0)(0) = 0$
$M_{22} = \begin{vmatrix} 1&0\\0&1 \end{vmatrix} = (1)(1) - (0)(0) = 1$
$M_{23} = \begin{vmatrix} 1&0\\0&0 \end{vmatrix} = (1)(0) - (0)(0) = 0$
$M_{31} = \begin{vmatrix} 0&0\\1&0 \end{vmatrix} = (0)(0) - (0)(1) = 0$
$M_{32} = \begin{vmatrix} 1&0\\0&0 \end{vmatrix} = (1)(0) - (0)(0) = 0$
$M_{33} = \begin{vmatrix} 1&0\\0&1 \end{vmatrix} = (1)(1) - (0)(0) = 1$
Cofactors:
$A_{11} = (-1)^{1+1} M_{11} = 1 \cdot 1 = 1$
$A_{12} = (-1)^{1+2} M_{12} = -1 \cdot 0 = 0$
$A_{13} = (-1)^{1+3} M_{13} = 1 \cdot 0 = 0$
$A_{21} = (-1)^{2+1} M_{21} = -1 \cdot 0 = 0$
$A_{22} = (-1)^{2+2} M_{22} = 1 \cdot 1 = 1$
$A_{23} = (-1)^{2+3} M_{23} = -1 \cdot 0 = 0$
$A_{31} = (-1)^{3+1} M_{31} = 1 \cdot 0 = 0$
$A_{32} = (-1)^{3+2} M_{32} = -1 \cdot 0 = 0$
$A_{33} = (-1)^{3+3} M_{33} = 1 \cdot 1 = 1$
(ii) Determinant: $\begin{vmatrix} 1&0&4\\3&5&−1\\0&1&2 \end{vmatrix}$
The elements are $a_{11}=1$, $a_{12}=0$, $a_{13}=4$, $a_{21}=3$, $a_{22}=5$, $a_{23}=-1$, $a_{31}=0$, $a_{32}=1$, $a_{33}=2$.
Minors:
$M_{11} = \begin{vmatrix} 5&−1\\1&2 \end{vmatrix} = (5)(2) - (−1)(1) = 10 + 1 = 11$
$M_{12} = \begin{vmatrix} 3&−1\\0&2 \end{vmatrix} = (3)(2) - (−1)(0) = 6 - 0 = 6$
$M_{13} = \begin{vmatrix} 3&5\\0&1 \end{vmatrix} = (3)(1) - (5)(0) = 3 - 0 = 3$
$M_{21} = \begin{vmatrix} 0&4\\1&2 \end{vmatrix} = (0)(2) - (4)(1) = 0 - 4 = -4$
$M_{22} = \begin{vmatrix} 1&4\\0&2 \end{vmatrix} = (1)(2) - (4)(0) = 2 - 0 = 2$
$M_{23} = \begin{vmatrix} 1&0\\0&1 \end{vmatrix} = (1)(1) - (0)(0) = 1 - 0 = 1$
$M_{31} = \begin{vmatrix} 0&4\\5&−1 \end{vmatrix} = (0)(−1) - (4)(5) = 0 - 20 = -20$
$M_{32} = \begin{vmatrix} 1&4\\3&−1 \end{vmatrix} = (1)(−1) - (4)(3) = -1 - 12 = -13$
$M_{33} = \begin{vmatrix} 1&0\\3&5 \end{vmatrix} = (1)(5) - (0)(3) = 5 - 0 = 5$
Cofactors:
$A_{11} = (-1)^{1+1} M_{11} = 1 \cdot 11 = 11$
$A_{12} = (-1)^{1+2} M_{12} = -1 \cdot 6 = -6$
$A_{13} = (-1)^{1+3} M_{13} = 1 \cdot 3 = 3$
$A_{21} = (-1)^{2+1} M_{21} = -1 \cdot (-4) = 4$
$A_{22} = (-1)^{2+2} M_{22} = 1 \cdot 2 = 2$
$A_{23} = (-1)^{2+3} M_{23} = -1 \cdot 1 = -1$
$A_{31} = (-1)^{3+1} M_{31} = 1 \cdot (-20) = -20$
$A_{32} = (-1)^{3+2} M_{32} = -1 \cdot (-13) = 13$
$A_{33} = (-1)^{3+3} M_{33} = 1 \cdot 5 = 5$
Question 3. Using Cofactors of elements of second row, evaluate ∆ = $\begin{vmatrix} 5&3&8\\2&0&1\\1&2&3 \end{vmatrix}$ .
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} 5&3&8\\2&0&1\\1&2&3 \end{vmatrix}$.
To Evaluate:
The value of the determinant using cofactors of the second row.
Solution:
The elements of the second row are $a_{21}=2$, $a_{22}=0$, and $a_{23}=1$.
The determinant $\Delta$ can be evaluated using the cofactors of the second row by the formula:
$\Delta = a_{21} A_{21} + a_{22} A_{22} + a_{23} A_{23}$
where $A_{ij}$ is the cofactor of the element $a_{ij}$.
First, we find the minors $M_{ij}$ for the elements in the second row:
$M_{21} = \begin{vmatrix} 3&8\\2&3 \end{vmatrix} = (3)(3) - (8)(2) = 9 - 16 = -7$
$M_{22} = \begin{vmatrix} 5&8\\1&3 \end{vmatrix} = (5)(3) - (8)(1) = 15 - 8 = 7$
$M_{23} = \begin{vmatrix} 5&3\\1&2 \end{vmatrix} = (5)(2) - (3)(1) = 10 - 3 = 7$
Now, we find the cofactors $A_{ij}$ using the formula $A_{ij} = (-1)^{i+j} M_{ij}$:
$A_{21} = (-1)^{2+1} M_{21} = (-1)^3 (-7) = -1 \cdot (-7) = 7$
$A_{22} = (-1)^{2+2} M_{22} = (-1)^4 (7) = 1 \cdot 7 = 7$
$A_{23} = (-1)^{2+3} M_{23} = (-1)^5 (7) = -1 \cdot 7 = -7$
Finally, substitute the elements of the second row and their cofactors into the determinant formula:
$\Delta = a_{21} A_{21} + a_{22} A_{22} + a_{23} A_{23}$
$\Delta = (2)(7) + (0)(7) + (1)(-7)$
$\Delta = 14 + 0 - 7$
$\Delta = 7$
The value of the determinant is 7.
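A supplementary SymPy sketch of the second-row cofactor expansion used in Question 3:

```python
from sympy import Matrix

M = Matrix([[5, 3, 8],
            [2, 0, 1],
            [1, 2, 3]])

# Sum of second-row elements times their cofactors (row index 1 is 0-based)
expansion = sum(M[1, j] * M.cofactor(1, j) for j in range(3))
print(expansion, M.det())  # 7 7
```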
Question 4. Using Cofactors of elements of third column, evaluate ∆ = $\begin{vmatrix} 1&x&yz\\1&y&zx\\1&z&xy \end{vmatrix}$ .
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} 1&x&yz\\1&y&zx\\1&z&xy \end{vmatrix}$.
To Evaluate:
The value of the determinant using cofactors of the third column.
Solution:
The elements of the third column are $a_{13}=yz$, $a_{23}=zx$, and $a_{33}=xy$.
The determinant $\Delta$ can be evaluated using the cofactors of the third column by the formula:
$\Delta = a_{13} A_{13} + a_{23} A_{23} + a_{33} A_{33}$
where $A_{ij}$ is the cofactor of the element $a_{ij}$, given by $A_{ij} = (-1)^{i+j} M_{ij}$.
First, we find the minors $M_{ij}$ for the elements in the third column:
$M_{13} = \begin{vmatrix} 1&y\\1&z \end{vmatrix} = (1)(z) - (y)(1) = z - y$
$M_{23} = \begin{vmatrix} 1&x\\1&z \end{vmatrix} = (1)(z) - (x)(1) = z - x$
$M_{33} = \begin{vmatrix} 1&x\\1&y \end{vmatrix} = (1)(y) - (x)(1) = y - x$
Now, we find the cofactors $A_{ij}$:
$A_{13} = (-1)^{1+3} M_{13} = (-1)^4 (z - y) = 1 \cdot (z - y) = z - y$
$A_{23} = (-1)^{2+3} M_{23} = (-1)^5 (z - x) = -1 \cdot (z - x) = x - z$
$A_{33} = (-1)^{3+3} M_{33} = (-1)^6 (y - x) = 1 \cdot (y - x) = y - x$
Finally, substitute the elements of the third column and their cofactors into the determinant formula:
$\Delta = a_{13} A_{13} + a_{23} A_{23} + a_{33} A_{33}$
$\Delta = (yz)(z - y) + (zx)(x - z) + (xy)(y - x)$
Expand and simplify the expression:
$\Delta = yz^2 - y^2z + zx^2 - z^2x + xy^2 - x^2y$
Rearrange the terms with a view to factoring them into the cyclic form $(x-y)(y-z)(z-x)$:
$\Delta = -x^2y + x^2z + xy^2 - y^2z - xz^2 + yz^2$
Group terms with common factors:
$\Delta = x^2(z - y) + x(y^2 - z^2) + yz(z - y)$
Factor $y^2 - z^2$ as $(y-z)(y+z)$:
$\Delta = x^2(z - y) + x(y-z)(y+z) + yz(z - y)$
Rewrite $(y-z)$ as $-(z-y)$:
$\Delta = x^2(z - y) - x(z-y)(y+z) + yz(z - y)$
Factor out the common term $(z - y)$:
$\Delta = (z - y) [x^2 - x(y+z) + yz]$
Simplify the expression inside the bracket:
$\Delta = (z - y) [x^2 - xy - xz + yz]$
Factor the quadratic expression by grouping terms:
$x^2 - xy - xz + yz = x(x - y) - z(x - y) = (x - y)(x - z)$
Substitute this back into the expression for $\Delta$:
$\Delta = (z - y)(x - y)(x - z)$
To write this in the standard cyclic form $(x-y)(y-z)(z-x)$, we rearrange the terms. $(z-y) = -(y-z)$ and $(x-z) = -(z-x)$.
$\Delta = -(y-z) \cdot (x-y) \cdot -(z-x)$
$\Delta = (-1)(-1) (x-y)(y-z)(z-x)$
$\Delta = (x-y)(y-z)(z-x)$
The value of the determinant is $(x-y)(y-z)(z-x)$.
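The factorisation can also be confirmed symbolically; the sketch below uses SymPy (my own choice of tool, not part of the textbook method) to build the determinant and compare it with the cyclic product:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
Delta = sp.Matrix([[1, x, y*z],
                   [1, y, z*x],
                   [1, z, x*y]]).det()

print(sp.factor(Delta))  # a sign/ordering rearrangement of (x - y)(y - z)(z - x)
print(sp.simplify(Delta - (x - y)*(y - z)*(z - x)))  # 0, so the two forms are identical
```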
Question 5. If ∆ = $\begin{vmatrix} a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33} \end{vmatrix}$ and Aij is Cofactors of aij , then value of ∆ is given by
(A) a11 A31+ a12 A32 + a13 A33
(B) a11 A11+ a12 A21 + a13 A31
(C) a21 A11+ a22 A12 + a23 A13
(D) a11 A11+ a21 A21 + a31 A31
Answer:
Solution:
The value of a determinant can be calculated by summing the products of the elements of any one row (or column) with their corresponding cofactors.
Let the determinant be $\Delta$. For a 3x3 matrix, the value of the determinant can be expanded along any row $i$ as:
$\Delta = a_{i1}A_{i1} + a_{i2}A_{i2} + a_{i3}A_{i3}$
or along any column $j$ as:
$\Delta = a_{1j}A_{1j} + a_{2j}A_{2j} + a_{3j}A_{3j}$
Let's examine the given options:
(A) $a_{11} A_{31} + a_{12} A_{32} + a_{13} A_{33}$ : This is the sum of the products of the elements of the first row ($a_{11}, a_{12}, a_{13}$) with the cofactors of the third row ($A_{31}, A_{32}, A_{33}$). The sum of the products of the elements of one row (or column) with the cofactors of a different row (or column) is always zero.
(B) $a_{11} A_{11} + a_{12} A_{21} + a_{13} A_{31}$ : This is the sum of the products of the elements of the first row ($a_{11}, a_{12}, a_{13}$) with the cofactors of the first column ($A_{11}, A_{21}, A_{31}$). This is not a valid expansion of the determinant.
(C) $a_{21} A_{11} + a_{22} A_{12} + a_{23} A_{13}$ : This is the sum of the products of the elements of the second row ($a_{21}, a_{22}, a_{23}$) with the cofactors of the first row ($A_{11}, A_{12}, A_{13}$). This sum is also zero.
(D) $a_{11} A_{11} + a_{21} A_{21} + a_{31} A_{31}$ : This is the sum of the product of elements of the first column ($a_{11}, a_{21}, a_{31}$) with their corresponding cofactors ($A_{11}, A_{21}, A_{31}$). This is a valid expansion along the first column.
Therefore, the correct expression for the value of $\Delta$ is the sum of the product of elements of a row (or column) with their corresponding cofactors.
Option (D) represents the expansion along the first column.
$\Delta = a_{11}A_{11} + a_{21}A_{21} + a_{31}A_{31}$
The correct option is (D) $a_{11} A_{11}+ a_{21} A_{21} + a_{31} A_{31}$.
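The "alien cofactor" fact used to rule out options (A) and (C) — elements of one row multiplied by the cofactors of a different row sum to zero — can be seen numerically from the identity $A(\text{adj }A) = |A|I$. The sketch below (with an arbitrarily chosen nonsingular example, an add-on of mine) prints that product; its diagonal entries are the own-cofactor sums and its off-diagonal entries are the alien-cofactor sums:

```python
import numpy as np

A = np.array([[5, 3, 8],
              [2, 0, 1],
              [1, 2, 3]], dtype=float)          # any nonsingular example works

adjA = np.linalg.det(A) * np.linalg.inv(A)      # adj A = |A| A^{-1} for nonsingular A
P = A @ adjA                                    # entry (i, k) = sum_j a_ij * A_kj

print(np.round(P))  # |A| on the diagonal (own cofactors), 0 elsewhere (alien cofactors)
```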
Example 23 to 26 (Before Exercise 4.5)
Example 23: Find adj A for A = $\begin{bmatrix}2&3\\1&4 \end{bmatrix}$
Answer:
Given:
The matrix $A = \begin{bmatrix}2&3\\1&4 \end{bmatrix}$.
To Find:
The adjoint of matrix A (adj A).
Solution:
For a $2 \times 2$ matrix $M = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, the adjoint is obtained by interchanging the diagonal elements and changing the sign of the off-diagonal elements:
$\text{adj } M = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$
In the given matrix $A = \begin{bmatrix}2&3\\1&4 \end{bmatrix}$, we have $a=2$, $b=3$, $c=1$, and $d=4$.
Applying the formula:
$\text{adj } A = \begin{bmatrix} 4 & -3 \\ -1 & 2 \end{bmatrix}$
Example 24: If A = $\begin{bmatrix}1&3&3\\1&4&3\\1&3&4 \end{bmatrix}$ , then verify that A adj A = | A| I. Also find A–1.
Answer:
Given:
$A = \begin{bmatrix}1&3&3\\1&4&3\\1&3&4 \end{bmatrix}$
To Verify:
$A \text{ adj } A = |A| I$
To Find:
$A^{-1}$
Solution:
First, we find the determinant of the matrix A.
$|A| = \det(A) = \begin{vmatrix}1&3&3\\1&4&3\\1&3&4 \end{vmatrix}$
Expanding along the first row:
$|A| = 1 \begin{vmatrix} 4 & 3 \\ 3 & 4 \end{vmatrix} - 3 \begin{vmatrix} 1 & 3 \\ 1 & 4 \end{vmatrix} + 3 \begin{vmatrix} 1 & 4 \\ 1 & 3 \end{vmatrix}$
$|A| = 1(4 \times 4 - 3 \times 3) - 3(1 \times 4 - 3 \times 1) + 3(1 \times 3 - 4 \times 1)$
$|A| = 1(16 - 9) - 3(4 - 3) + 3(3 - 4)$
$|A| = 1(7) - 3(1) + 3(-1)$
$|A| = 7 - 3 - 3 = 1$
So, $|A| = 1$. Since $|A| \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A. First, we calculate the matrix of cofactors.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 4 & 3 \\ 3 & 4 \end{vmatrix} = (1)(16 - 9) = 7$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 1 & 3 \\ 1 & 4 \end{vmatrix} = (-1)(4 - 3) = -1$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 1 & 4 \\ 1 & 3 \end{vmatrix} = (1)(3 - 4) = -1$
$C_{21} = (-1)^{2+1} \begin{vmatrix} 3 & 3 \\ 3 & 4 \end{vmatrix} = (-1)(12 - 9) = -3$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 3 \\ 1 & 4 \end{vmatrix} = (1)(4 - 3) = 1$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & 3 \\ 1 & 3 \end{vmatrix} = (-1)(3 - 3) = 0$
$C_{31} = (-1)^{3+1} \begin{vmatrix} 3 & 3 \\ 4 & 3 \end{vmatrix} = (1)(9 - 12) = -3$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 3 \\ 1 & 3 \end{vmatrix} = (-1)(3 - 3) = 0$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & 3 \\ 1 & 4 \end{vmatrix} = (1)(4 - 3) = 1$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} 7 & -1 & -1 \\ -3 & 1 & 0 \\ -3 & 0 & 1 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} 7 & -3 & -3 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix}$
Now, we verify the property $A \text{ adj } A = |A| I$.
First, calculate $|A| I$:
$|A| I = 1 \cdot \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$
Next, calculate $A \text{ adj } A$:
$A \text{ adj } A = \begin{bmatrix}1&3&3\\1&4&3\\1&3&4 \end{bmatrix} \begin{bmatrix} 7 & -3 & -3 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix}$
Multiplying the matrices:
$A \text{ adj } A = \begin{bmatrix} 1(7)+3(-1)+3(-1) & 1(-3)+3(1)+3(0) & 1(-3)+3(0)+3(1) \\ 1(7)+4(-1)+3(-1) & 1(-3)+4(1)+3(0) & 1(-3)+4(0)+3(1) \\ 1(7)+3(-1)+4(-1) & 1(-3)+3(1)+4(0) & 1(-3)+3(0)+4(1) \end{bmatrix}$
$A \text{ adj } A = \begin{bmatrix} 7-3-3 & -3+3+0 & -3+0+3 \\ 7-4-3 & -3+4+0 & -3+0+3 \\ 7-3-4 & -3+3+0 & -3+0+4 \end{bmatrix}$
$A \text{ adj } A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$
Since $A \text{ adj } A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$ and $|A| I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$, we have verified that $A \text{ adj } A = |A| I$.
Finally, we find the inverse of A using the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Since $|A| = 1$ and $\text{adj } A = \begin{bmatrix} 7 & -3 & -3 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix}$,
$A^{-1} = \frac{1}{1} \begin{bmatrix} 7 & -3 & -3 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix}$
$A^{-1} = \begin{bmatrix} 7 & -3 & -3 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix}$
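As an illustrative cross-check (not part of the textbook solution), the adjoint can be rebuilt from cofactors in NumPy and the relation $A\,\text{adj }A = |A| I$ verified numerically; the `adjoint` helper below is my own:

```python
import numpy as np

def adjoint(M):
    # Transpose of the cofactor matrix
    n = M.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1, 3, 3],
              [1, 4, 3],
              [1, 3, 4]], dtype=float)

adjA = adjoint(A)
detA = np.linalg.det(A)

print(np.allclose(A @ adjA, detA * np.eye(3)))  # True: A adj A = |A| I
print(np.round(adjA / detA))                    # matches the A^{-1} found above
```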
Example 25: If A = $\begin{bmatrix}2&3\\1&−4 \end{bmatrix}$ and B = $\begin{bmatrix}1&−2\\−1&3 \end{bmatrix}$ , then verify that (AB)-1 = B-1A-1
Answer:
Given:
$A = \begin{bmatrix}2&3\\1&−4 \end{bmatrix}$
$B = \begin{bmatrix}1&−2\\−1&3 \end{bmatrix}$
To Verify:
$(AB)^{-1} = B^{-1}A^{-1}$
Solution:
First, we calculate the product matrix AB.
$AB = \begin{bmatrix}2&3\\1&−4 \end{bmatrix} \begin{bmatrix}1&−2\\−1&3 \end{bmatrix}$
$AB = \begin{bmatrix} (2)(1)+(3)(-1) & (2)(-2)+(3)(3) \\ (1)(1)+(-4)(-1) & (1)(-2)+(-4)(3) \end{bmatrix}$
$AB = \begin{bmatrix} 2-3 & -4+9 \\ 1+4 & -2-12 \end{bmatrix}$
$AB = \begin{bmatrix} -1 & 5 \\ 5 & -14 \end{bmatrix}$
Next, we find the inverse of AB, $(AB)^{-1}$. We need the determinant and the adjoint of AB.
$\det(AB) = (-1)(-14) - (5)(5) = 14 - 25 = -11$
Since $\det(AB) \neq 0$, $(AB)$ is invertible.
The adjoint of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is $\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$.
$\text{adj}(AB) = \text{adj}\begin{bmatrix} -1 & 5 \\ 5 & -14 \end{bmatrix} = \begin{bmatrix} -14 & -5 \\ -5 & -1 \end{bmatrix}$
So, $(AB)^{-1} = \frac{1}{\det(AB)} \text{adj}(AB) = \frac{1}{-11} \begin{bmatrix} -14 & -5 \\ -5 & -1 \end{bmatrix}$
$(AB)^{-1} = \begin{bmatrix} \frac{-14}{-11} & \frac{-5}{-11} \\ \frac{-5}{-11} & \frac{-1}{-11} \end{bmatrix} = \begin{bmatrix} \frac{14}{11} & \frac{5}{11} \\ \frac{5}{11} & \frac{1}{11} \end{bmatrix}$ ... (i)
Now, we find the inverses of A and B, $A^{-1}$ and $B^{-1}$.
For matrix A:
$A = \begin{bmatrix}2&3\\1&−4 \end{bmatrix}$
$\det(A) = (2)(-4) - (3)(1) = -8 - 3 = -11$
Since $\det(A) \neq 0$, A is invertible.
$\text{adj}(A) = \begin{bmatrix} -4 & -3 \\ -1 & 2 \end{bmatrix}$
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{-11} \begin{bmatrix} -4 & -3 \\ -1 & 2 \end{bmatrix} = \begin{bmatrix} \frac{4}{11} & \frac{3}{11} \\ \frac{1}{11} & -\frac{2}{11} \end{bmatrix}$
For matrix B:
$B = \begin{bmatrix}1&−2\\−1&3 \end{bmatrix}$
$\det(B) = (1)(3) - (-2)(-1) = 3 - 2 = 1$
Since $\det(B) \neq 0$, B is invertible.
$\text{adj}(B) = \begin{bmatrix} 3 & 2 \\ 1 & 1 \end{bmatrix}$
$B^{-1} = \frac{1}{\det(B)} \text{adj}(B) = \frac{1}{1} \begin{bmatrix} 3 & 2 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 3 & 2 \\ 1 & 1 \end{bmatrix}$
Finally, we calculate $B^{-1}A^{-1}$.
$B^{-1}A^{-1} = \begin{bmatrix} 3 & 2 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} \frac{4}{11} & \frac{3}{11} \\ \frac{1}{11} & -\frac{2}{11} \end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix} (3)(\frac{4}{11})+(2)(\frac{1}{11}) & (3)(\frac{3}{11})+(2)(-\frac{2}{11}) \\ (1)(\frac{4}{11})+(1)(\frac{1}{11}) & (1)(\frac{3}{11})+(1)(-\frac{2}{11}) \end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix} \frac{12}{11}+\frac{2}{11} & \frac{9}{11}-\frac{4}{11} \\ \frac{4}{11}+\frac{1}{11} & \frac{3}{11}-\frac{2}{11} \end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix} \frac{14}{11} & \frac{5}{11} \\ \frac{5}{11} & \frac{1}{11} \end{bmatrix}$ ... (ii)
Comparing the results from (i) and (ii), we see that $(AB)^{-1} = B^{-1}A^{-1}$.
Thus, the property is verified.
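For readers who want a quick numerical confirmation of the reversal law, here is a short NumPy check (an add-on, not part of the textbook solution):

```python
import numpy as np

A = np.array([[2, 3], [1, -4]], dtype=float)
B = np.array([[1, -2], [-1, 3]], dtype=float)

lhs = np.linalg.inv(A @ B)                  # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # B^{-1} A^{-1}

print(np.allclose(lhs, rhs))  # True
print(lhs * 11)               # entries 14, 5, 5, 1 (all over 11), as in (i) and (ii) above
```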
Example 26: Show that the matrix A = $\begin{bmatrix}2&3\\1&2 \end{bmatrix}$ satisfies the equation A2 – 4A + I = O, where I is 2 × 2 identity matrix and O is 2 × 2 zero matrix. Using this equation, find A–1.
Answer:
Given:
$A = \begin{bmatrix}2&3\\1&2 \end{bmatrix}$
I is the $2 \times 2$ identity matrix, $I = \begin{bmatrix}1&0\\0&1 \end{bmatrix}$
O is the $2 \times 2$ zero matrix, $O = \begin{bmatrix}0&0\\0&0 \end{bmatrix}$
To Show:
$A^2 - 4A + I = O$
To Find:
$A^{-1}$ using the given equation.
Solution:
First, we calculate $A^2$.
$A^2 = A \cdot A = \begin{bmatrix}2&3\\1&2 \end{bmatrix} \begin{bmatrix}2&3\\1&2 \end{bmatrix}$
$A^2 = \begin{bmatrix} (2)(2)+(3)(1) & (2)(3)+(3)(2) \\ (1)(2)+(2)(1) & (1)(3)+(2)(2) \end{bmatrix}$
$A^2 = \begin{bmatrix} 4+3 & 6+6 \\ 2+2 & 3+4 \end{bmatrix} = \begin{bmatrix} 7 & 12 \\ 4 & 7 \end{bmatrix}$
Next, we calculate $4A$.
$4A = 4 \begin{bmatrix}2&3\\1&2 \end{bmatrix} = \begin{bmatrix} 4(2) & 4(3) \\ 4(1) & 4(2) \end{bmatrix} = \begin{bmatrix} 8 & 12 \\ 4 & 8 \end{bmatrix}$
Now, we substitute the values into the expression $A^2 - 4A + I$:
$A^2 - 4A + I = \begin{bmatrix} 7 & 12 \\ 4 & 7 \end{bmatrix} - \begin{bmatrix} 8 & 12 \\ 4 & 8 \end{bmatrix} + \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$
$A^2 - 4A + I = \begin{bmatrix} 7-8 & 12-12 \\ 4-4 & 7-8 \end{bmatrix} + \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$
$A^2 - 4A + I = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix} + \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$
$A^2 - 4A + I = \begin{bmatrix} -1+1 & 0+0 \\ 0+0 & -1+1 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$
Since $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$ is the zero matrix O, we have shown that $A^2 - 4A + I = O$.
Now, we use the equation $A^2 - 4A + I = O$ to find $A^{-1}$.
We know that $A$ is invertible because $|A| = (2)(2) - (3)(1) = 4 - 3 = 1 \neq 0$.
Multiply the equation by $A^{-1}$ from the left:
$A^{-1}(A^2 - 4A + I) = A^{-1}O$
Using the distributive property and the properties of inverse and identity matrices ($A^{-1}A = I$, $IA = A$, $A^{-1}I = A^{-1}$, $A^{-1}O = O$):
$A^{-1}A^2 - A^{-1}(4A) + A^{-1}I = O$
$(A^{-1}A)A - 4(A^{-1}A) + A^{-1} = O$
$IA - 4I + A^{-1} = O$
$A - 4I + A^{-1} = O$
Now, we solve for $A^{-1}$:
$A^{-1} = 4I - A$
Finally, we calculate $4I - A$:
$A^{-1} = 4 \begin{bmatrix}1&0\\0&1 \end{bmatrix} - \begin{bmatrix}2&3\\1&2 \end{bmatrix}$
$A^{-1} = \begin{bmatrix} 4(1) & 4(0) \\ 4(0) & 4(1) \end{bmatrix} - \begin{bmatrix}2&3\\1&2 \end{bmatrix}$
$A^{-1} = \begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix} - \begin{bmatrix}2&3\\1&2 \end{bmatrix}$
$A^{-1} = \begin{bmatrix} 4-2 & 0-3 \\ 0-1 & 4-2 \end{bmatrix}$
$A^{-1} = \begin{bmatrix} 2 & -3 \\ -1 & 2 \end{bmatrix}$
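A two-line NumPy check (my own addition) confirms both the matrix equation and that $4I - A$ really is the inverse:

```python
import numpy as np

A = np.array([[2, 3], [1, 2]], dtype=float)
I = np.eye(2)

print(np.allclose(A @ A - 4 * A + I, np.zeros((2, 2))))  # True: A^2 - 4A + I = O
print(np.allclose(A @ (4 * I - A), I))                   # True: so 4I - A = A^{-1}
```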
Exercise 4.5
Find adjoint of each of the matrices in Exercises 1 and 2.
Question 1. $\begin{bmatrix}1&2\\3&4 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&2\\3&4 \end{bmatrix}$
To Find:
Adjoint of matrix A (adj A).
Solution:
For a $2 \times 2$ matrix $M = \begin{bmatrix}a&b\\c&d \end{bmatrix}$, the adjoint of M is given by interchanging the diagonal elements and changing the sign of the off-diagonal elements.
$\text{adj } M = \begin{bmatrix}d&-b\\-c&a \end{bmatrix}$
In our case, for matrix $A = \begin{bmatrix}1&2\\3&4 \end{bmatrix}$, we have $a=1$, $b=2$, $c=3$, and $d=4$.
Applying the formula for the adjoint of a $2 \times 2$ matrix:
$\text{adj } A = \begin{bmatrix}4&-2\\-3&1 \end{bmatrix}$
Question 2. $\begin{bmatrix}1&−1&2\\2&3&5\\−2&0&1 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&-1&2\\2&3&5\\-2&0&1 \end{bmatrix}$
To Find:
Adjoint of matrix A (adj A).
Solution:
The adjoint of a matrix A is the transpose of the matrix of its cofactors.
First, we calculate the cofactors $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor of the element $a_{ij}$.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 3 & 5 \\ 0 & 1 \end{vmatrix} = (1)((3)(1) - (5)(0)) = 3 - 0 = 3$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 2 & 5 \\ -2 & 1 \end{vmatrix} = (-1)((2)(1) - (5)(-2)) = (-1)(2 + 10) = -12$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 2 & 3 \\ -2 & 0 \end{vmatrix} = (1)((2)(0) - (3)(-2)) = 0 + 6 = 6$
$C_{21} = (-1)^{2+1} \begin{vmatrix} -1 & 2 \\ 0 & 1 \end{vmatrix} = (-1)((-1)(1) - (2)(0)) = (-1)(-1 - 0) = 1$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 2 \\ -2 & 1 \end{vmatrix} = (1)((1)(1) - (2)(-2)) = 1 + 4 = 5$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & -1 \\ -2 & 0 \end{vmatrix} = (-1)((1)(0) - (-1)(-2)) = (-1)(0 - 2) = 2$
$C_{31} = (-1)^{3+1} \begin{vmatrix} -1 & 2 \\ 3 & 5 \end{vmatrix} = (1)((-1)(5) - (2)(3)) = -5 - 6 = -11$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 2 \\ 2 & 5 \end{vmatrix} = (-1)((1)(5) - (2)(2)) = (-1)(5 - 4) = -1$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & -1 \\ 2 & 3 \end{vmatrix} = (1)((1)(3) - (-1)(2)) = 3 + 2 = 5$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} 3 & -12 & 6 \\ 1 & 5 & 2 \\ -11 & -1 & 5 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} 3 & 1 & -11 \\ -12 & 5 & -1 \\ 6 & 2 & 5 \end{bmatrix}$
Verify A (adj A) = (adj A) A = | A | I in Exercises 3 and 4
Question 3. $\begin{bmatrix}2&3\\−4&−6 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}2&3\\-4&-6 \end{bmatrix}$
To Verify:
$A (\text{adj } A) = (\text{adj } A) A = |A| I$
Solution:
First, we calculate the determinant of A.
$|A| = \det(A) = (2)(-6) - (3)(-4)$
$|A| = -12 - (-12)$
$|A| = -12 + 12 = 0$
Next, we find the adjoint of A. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d \end{bmatrix}$, the adjoint is $\begin{bmatrix}d&-b\\-c&a \end{bmatrix}$.
$A = \begin{bmatrix}2&3\\-4&-6 \end{bmatrix}$
$\text{adj } A = \begin{bmatrix}-6&-3\\-(-4)&2 \end{bmatrix} = \begin{bmatrix}-6&-3\\4&2 \end{bmatrix}$
Now, we calculate $A (\text{adj } A)$.
$A (\text{adj } A) = \begin{bmatrix}2&3\\-4&-6 \end{bmatrix} \begin{bmatrix}-6&-3\\4&2 \end{bmatrix}$
$A (\text{adj } A) = \begin{bmatrix} (2)(-6)+(3)(4) & (2)(-3)+(3)(2) \\ (-4)(-6)+(-6)(4) & (-4)(-3)+(-6)(2) \end{bmatrix}$
$A (\text{adj } A) = \begin{bmatrix} -12+12 & -6+6 \\ 24-24 & 12-12 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$ ... (i)
Next, we calculate $(\text{adj } A) A$.
$(\text{adj } A) A = \begin{bmatrix}-6&-3\\4&2 \end{bmatrix} \begin{bmatrix}2&3\\-4&-6 \end{bmatrix}$
$(\text{adj } A) A = \begin{bmatrix} (-6)(2)+(-3)(-4) & (-6)(3)+(-3)(-6) \\ (4)(2)+(2)(-4) & (4)(3)+(2)(-6) \end{bmatrix}$
$(\text{adj } A) A = \begin{bmatrix} -12+12 & -18+18 \\ 8-8 & 12-12 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$ ... (ii)
Finally, we calculate $|A| I$. Here I is the $2 \times 2$ identity matrix, $I = \begin{bmatrix}1&0\\0&1 \end{bmatrix}$.
$|A| I = 0 \cdot \begin{bmatrix}1&0\\0&1 \end{bmatrix} = \begin{bmatrix}0 \cdot 1 & 0 \cdot 0\\0 \cdot 0 & 0 \cdot 1 \end{bmatrix} = \begin{bmatrix}0&0\\0&0 \end{bmatrix}$ ... (iii)
From (i), (ii), and (iii), we see that $A (\text{adj } A) = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$, $(\text{adj } A) A = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$, and $|A| I = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$.
Therefore, $A (\text{adj } A) = (\text{adj } A) A = |A| I$ is verified.
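Since $|A| = 0$ here, the products with the adjoint collapse to the zero matrix; the short NumPy check below (an add-on of mine) makes this concrete:

```python
import numpy as np

A = np.array([[2, 3], [-4, -6]], dtype=float)
adjA = np.array([[-6, -3], [4, 2]], dtype=float)  # the adjoint found above

print(np.isclose(np.linalg.det(A), 0))            # True: A is singular
print(np.allclose(A @ adjA, np.zeros((2, 2))))    # True: A adj A = |A| I = O
print(np.allclose(adjA @ A, np.zeros((2, 2))))    # True: (adj A) A = O as well
```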
Question 4. $\begin{bmatrix}1&−1&2\\3&0&−2\\1&0&3 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&-1&2\\3&0&-2\\1&0&3 \end{bmatrix}$
To Verify:
$A (\text{adj } A) = (\text{adj } A) A = |A| I$
Solution:
First, we calculate the determinant of A. Expanding along the second column (since it contains zeros):
$|A| = \det(A) = (-1) \cdot C_{12} + 0 \cdot C_{22} + 0 \cdot C_{32}$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 3 & -2 \\ 1 & 3 \end{vmatrix} = (-1)((3)(3) - (-2)(1)) = (-1)(9 + 2) = -11$
$|A| = (-1)(-11) = 11$
Now, we find the adjoint of A by finding the transpose of the matrix of cofactors.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 0 & -2 \\ 0 & 3 \end{vmatrix} = (1)(0 - 0) = 0$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 3 & -2 \\ 1 & 3 \end{vmatrix} = (-1)(9 + 2) = -11$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 3 & 0 \\ 1 & 0 \end{vmatrix} = (1)(0 - 0) = 0$
$C_{21} = (-1)^{2+1} \begin{vmatrix} -1 & 2 \\ 0 & 3 \end{vmatrix} = (-1)(-3 - 0) = 3$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 2 \\ 1 & 3 \end{vmatrix} = (1)(3 - 2) = 1$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & -1 \\ 1 & 0 \end{vmatrix} = (-1)(0 + 1) = -1$
$C_{31} = (-1)^{3+1} \begin{vmatrix} -1 & 2 \\ 0 & -2 \end{vmatrix} = (1)(2 - 0) = 2$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 2 \\ 3 & -2 \end{vmatrix} = (-1)(-2 - 6) = 8$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & -1 \\ 3 & 0 \end{vmatrix} = (1)(0 + 3) = 3$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} 0 & -11 & 0 \\ 3 & 1 & -1 \\ 2 & 8 & 3 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} 0 & 3 & 2 \\ -11 & 1 & 8 \\ 0 & -1 & 3 \end{bmatrix}$
Now, we calculate $A (\text{adj } A)$.
$A (\text{adj } A) = \begin{bmatrix}1&-1&2\\3&0&-2\\1&0&3 \end{bmatrix} \begin{bmatrix} 0 & 3 & 2 \\ -11 & 1 & 8 \\ 0 & -1 & 3 \end{bmatrix}$
$A (\text{adj } A) = \begin{bmatrix} (1)(0)+(-1)(-11)+(2)(0) & (1)(3)+(-1)(1)+(2)(-1) & (1)(2)+(-1)(8)+(2)(3) \\ (3)(0)+(0)(-11)+(-2)(0) & (3)(3)+(0)(1)+(-2)(-1) & (3)(2)+(0)(8)+(-2)(3) \\ (1)(0)+(0)(-11)+(3)(0) & (1)(3)+(0)(1)+(3)(-1) & (1)(2)+(0)(8)+(3)(3) \end{bmatrix}$
$A (\text{adj } A) = \begin{bmatrix} 0+11+0 & 3-1-2 & 2-8+6 \\ 0+0+0 & 9+0+2 & 6+0-6 \\ 0+0+0 & 3+0-3 & 2+0+9 \end{bmatrix}$
$A (\text{adj } A) = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 11 & 0 \\ 0 & 0 & 11 \end{bmatrix}$ ... (i)
Next, we calculate $(\text{adj } A) A$.
$(\text{adj } A) A = \begin{bmatrix} 0 & 3 & 2 \\ -11 & 1 & 8 \\ 0 & -1 & 3 \end{bmatrix} \begin{bmatrix}1&-1&2\\3&0&-2\\1&0&3 \end{bmatrix}$
$(\text{adj } A) A = \begin{bmatrix} (0)(1)+(3)(3)+(2)(1) & (0)(-1)+(3)(0)+(2)(0) & (0)(2)+(3)(-2)+(2)(3) \\ (-11)(1)+(1)(3)+(8)(1) & (-11)(-1)+(1)(0)+(8)(0) & (-11)(2)+(1)(-2)+(8)(3) \\ (0)(1)+(-1)(3)+(3)(1) & (0)(-1)+(-1)(0)+(3)(0) & (0)(2)+(-1)(-2)+(3)(3) \end{bmatrix}$
$(\text{adj } A) A = \begin{bmatrix} 0+9+2 & 0+0+0 & 0-6+6 \\ -11+3+8 & 11+0+0 & -22-2+24 \\ 0-3+3 & 0+0+0 & 0+2+9 \end{bmatrix}$
$(\text{adj } A) A = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 11 & 0 \\ 0 & 0 & 11 \end{bmatrix}$ ... (ii)
Finally, we calculate $|A| I$. Here I is the $3 \times 3$ identity matrix, $I = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix}$.
$|A| I = 11 \cdot \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix} = \begin{bmatrix}11 \cdot 1 & 11 \cdot 0 & 11 \cdot 0\\11 \cdot 0 & 11 \cdot 1 & 11 \cdot 0\\11 \cdot 0 & 11 \cdot 0 & 11 \cdot 1 \end{bmatrix} = \begin{bmatrix}11&0&0\\0&11&0\\0&0&11 \end{bmatrix}$ ... (iii)
From (i), (ii), and (iii), we see that $A (\text{adj } A) = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 11 & 0 \\ 0 & 0 & 11 \end{bmatrix}$, $(\text{adj } A) A = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 11 & 0 \\ 0 & 0 & 11 \end{bmatrix}$, and $|A| I = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 11 & 0 \\ 0 & 0 & 11 \end{bmatrix}$.
Therefore, $A (\text{adj } A) = (\text{adj } A) A = |A| I$ is verified.
Find the inverse of each of the matrices (if it exists) given in Exercises 5 to 11.
Question 5. $\begin{bmatrix}2&−2\\4&3 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}2&-2\\4&3 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
$|A| = \det(A) = (2)(3) - (-2)(4)$
$|A| = 6 - (-8)$
$|A| = 6 + 8 = 14$
Since $|A| = 14 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d \end{bmatrix}$, the adjoint is $\begin{bmatrix}d&-b\\-c&a \end{bmatrix}$.
$A = \begin{bmatrix}2&-2\\4&3 \end{bmatrix}$
$\text{adj } A = \begin{bmatrix}3&-(-2)\\-4&2 \end{bmatrix} = \begin{bmatrix}3&2\\-4&2 \end{bmatrix}$
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant and the adjoint matrix:
$A^{-1} = \frac{1}{14} \begin{bmatrix}3&2\\-4&2 \end{bmatrix}$
Multiplying the scalar with the matrix:
$A^{-1} = \begin{bmatrix} \frac{3}{14} & \frac{2}{14} \\ \frac{-4}{14} & \frac{2}{14} \end{bmatrix}$
Simplifying the fractions:
$A^{-1} = \begin{bmatrix} \frac{3}{14} & \frac{1}{7} \\ -\frac{2}{7} & \frac{1}{7} \end{bmatrix}$
Question 6. $\begin{bmatrix}−1&5\\−3&2 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}−1&5\\−3&2 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
$|A| = \det(A) = (-1)(2) - (5)(-3)$
$|A| = -2 - (-15)$
$|A| = -2 + 15 = 13$
Since $|A| = 13 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d \end{bmatrix}$, the adjoint is $\begin{bmatrix}d&-b\\-c&a \end{bmatrix}$.
$A = \begin{bmatrix}−1&5\\−3&2 \end{bmatrix}$
$\text{adj } A = \begin{bmatrix}2&-5\\-(-3)&-1 \end{bmatrix} = \begin{bmatrix}2&-5\\3&-1 \end{bmatrix}$
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant and the adjoint matrix:
$A^{-1} = \frac{1}{13} \begin{bmatrix}2&-5\\3&-1 \end{bmatrix}$
Multiplying the scalar with the matrix:
$A^{-1} = \begin{bmatrix} \frac{2}{13} & \frac{-5}{13} \\ \frac{3}{13} & \frac{-1}{13} \end{bmatrix}$
Question 7. $\begin{bmatrix}1&2&3\\0&2&4\\0&0&5 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&2&3\\0&2&4\\0&0&5 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
Since A is an upper triangular matrix, its determinant is the product of its diagonal elements.
$|A| = \det(A) = (1)(2)(5)$
$|A| = 10$
Since $|A| = 10 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A by finding the transpose of the matrix of cofactors.
We calculate the cofactors $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor of the element $a_{ij}$.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 2 & 4 \\ 0 & 5 \end{vmatrix} = (1)((2)(5) - (4)(0)) = 10 - 0 = 10$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 0 & 4 \\ 0 & 5 \end{vmatrix} = (-1)((0)(5) - (4)(0)) = -(0 - 0) = 0$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 0 & 2 \\ 0 & 0 \end{vmatrix} = (1)((0)(0) - (2)(0)) = 0 - 0 = 0$
$C_{21} = (-1)^{2+1} \begin{vmatrix} 2 & 3 \\ 0 & 5 \end{vmatrix} = (-1)((2)(5) - (3)(0)) = -(10 - 0) = -10$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 3 \\ 0 & 5 \end{vmatrix} = (1)((1)(5) - (3)(0)) = 5 - 0 = 5$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & 2 \\ 0 & 0 \end{vmatrix} = (-1)((1)(0) - (2)(0)) = -(0 - 0) = 0$
$C_{31} = (-1)^{3+1} \begin{vmatrix} 2 & 3 \\ 2 & 4 \end{vmatrix} = (1)((2)(4) - (3)(2)) = 8 - 6 = 2$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 3 \\ 0 & 4 \end{vmatrix} = (-1)((1)(4) - (3)(0)) = -(4 - 0) = -4$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & 2 \\ 0 & 2 \end{vmatrix} = (1)((1)(2) - (2)(0)) = 2 - 0 = 2$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} 10 & 0 & 0 \\ -10 & 5 & 0 \\ 2 & -4 & 2 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} 10 & -10 & 2 \\ 0 & 5 & -4 \\ 0 & 0 & 2 \end{bmatrix}$ ... (i)
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant $|A| = 10$ and the adjoint matrix from (i):
$A^{-1} = \frac{1}{10} \begin{bmatrix} 10 & -10 & 2 \\ 0 & 5 & -4 \\ 0 & 0 & 2 \end{bmatrix}$
Multiplying the scalar $\frac{1}{10}$ with each element of the matrix:
$A^{-1} = \begin{bmatrix} \frac{10}{10} & \frac{-10}{10} & \frac{2}{10} \\ \frac{0}{10} & \frac{5}{10} & \frac{-4}{10} \\ \frac{0}{10} & \frac{0}{10} & \frac{2}{10} \end{bmatrix}$
Simplifying the fractions:
$A^{-1} = \begin{bmatrix} 1 & -1 & \frac{1}{5} \\ 0 & \frac{1}{2} & -\frac{2}{5} \\ 0 & 0 & \frac{1}{5} \end{bmatrix}$ ... (ii)
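Because $A$ is triangular, both the determinant (the product of the diagonal) and the inverse in (ii) are easy to sanity-check; the NumPy snippet below is an illustrative cross-check, not part of the required working:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [0, 2, 4],
              [0, 0, 5]], dtype=float)

A_inv = np.array([[1, -1,   1/5],
                  [0, 1/2, -2/5],
                  [0, 0,    1/5]])

print(np.isclose(np.linalg.det(A), np.prod(np.diag(A))))  # True: |A| = 1*2*5 = 10
print(np.allclose(A @ A_inv, np.eye(3)))                  # True: (ii) is indeed A^{-1}
```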
Question 8. $\begin{bmatrix}1&0&0\\3&3&0\\5&2&−1 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&0&0\\3&3&0\\5&2&−1 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
Since A is a lower triangular matrix, its determinant is the product of its diagonal elements.
$|A| = \det(A) = (1)(3)(-1)$
$|A| = -3$
Since $|A| = -3 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A by finding the transpose of the matrix of cofactors.
We calculate the cofactors $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor of the element $a_{ij}$.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 3 & 0 \\ 2 & -1 \end{vmatrix} = (1)((3)(-1) - (0)(2)) = -3 - 0 = -3$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 3 & 0 \\ 5 & -1 \end{vmatrix} = (-1)((3)(-1) - (0)(5)) = (-1)(-3 - 0) = 3$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 3 & 3 \\ 5 & 2 \end{vmatrix} = (1)((3)(2) - (3)(5)) = 6 - 15 = -9$
$C_{21} = (-1)^{2+1} \begin{vmatrix} 0 & 0 \\ 2 & -1 \end{vmatrix} = (-1)((0)(-1) - (0)(2)) = (-1)(0 - 0) = 0$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 0 \\ 5 & -1 \end{vmatrix} = (1)((1)(-1) - (0)(5)) = -1 - 0 = -1$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & 0 \\ 5 & 2 \end{vmatrix} = (-1)((1)(2) - (0)(5)) = (-1)(2 - 0) = -2$
$C_{31} = (-1)^{3+1} \begin{vmatrix} 0 & 0 \\ 3 & 0 \end{vmatrix} = (1)((0)(0) - (0)(3)) = 0 - 0 = 0$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 0 \\ 3 & 0 \end{vmatrix} = (-1)((1)(0) - (0)(3)) = (-1)(0 - 0) = 0$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & 0 \\ 3 & 3 \end{vmatrix} = (1)((1)(3) - (0)(3)) = 3 - 0 = 3$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} -3 & 3 & -9 \\ 0 & -1 & -2 \\ 0 & 0 & 3 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} -3 & 0 & 0 \\ 3 & -1 & 0 \\ -9 & -2 & 3 \end{bmatrix}$ ... (i)
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant $|A| = -3$ and the adjoint matrix from (i):
$A^{-1} = \frac{1}{-3} \begin{bmatrix} -3 & 0 & 0 \\ 3 & -1 & 0 \\ -9 & -2 & 3 \end{bmatrix}$
Multiplying the scalar $\frac{1}{-3}$ with each element of the matrix:
$A^{-1} = \begin{bmatrix} \frac{-3}{-3} & \frac{0}{-3} & \frac{0}{-3} \\ \frac{3}{-3} & \frac{-1}{-3} & \frac{0}{-3} \\ \frac{-9}{-3} & \frac{-2}{-3} & \frac{3}{-3} \end{bmatrix}$
Simplifying the fractions:
$A^{-1} = \begin{bmatrix} 1 & 0 & 0 \\ -1 & \frac{1}{3} & 0 \\ 3 & \frac{2}{3} & -1 \end{bmatrix}$ ... (ii)
Question 9. $\begin{bmatrix}2&1&3\\4&−1&0\\−7&2&1 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}2&1&3\\4&−1&0\\−7&2&1 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
$|A| = \det(A) = \begin{vmatrix}2&1&3\\4&−1&0\\−7&2&1 \end{vmatrix}$
Expanding along the third column (since it has a zero):
$|A| = 3 \cdot C_{13} + 0 \cdot C_{23} + 1 \cdot C_{33}$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 4 & -1 \\ -7 & 2 \end{vmatrix} = (1)((4)(2) - (-1)(-7)) = 8 - 7 = 1$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 2 & 1 \\ -7 & 2 \end{vmatrix} = (-1)((2)(2) - (1)(-7)) = (-1)(4 + 7) = -11$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 2 & 1 \\ 4 & -1 \end{vmatrix} = (1)((2)(-1) - (1)(4)) = -2 - 4 = -6$
$|A| = 3(1) + 0(-11) + 1(-6) = 3 + 0 - 6 = -3$
Since $|A| = -3 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A by finding the transpose of the matrix of cofactors.
$C_{11} = (-1)^{1+1} \begin{vmatrix} -1 & 0 \\ 2 & 1 \end{vmatrix} = (1)((-1)(1) - (0)(2)) = -1 - 0 = -1$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 4 & 0 \\ -7 & 1 \end{vmatrix} = (-1)((4)(1) - (0)(-7)) = (-1)(4 - 0) = -4$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 4 & -1 \\ -7 & 2 \end{vmatrix} = (1)((4)(2) - (-1)(-7)) = 8 - 7 = 1$
$C_{21} = (-1)^{2+1} \begin{vmatrix} 1 & 3 \\ 2 & 1 \end{vmatrix} = (-1)((1)(1) - (3)(2)) = (-1)(1 - 6) = 5$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 2 & 3 \\ -7 & 1 \end{vmatrix} = (1)((2)(1) - (3)(-7)) = 2 + 21 = 23$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 2 & 1 \\ -7 & 2 \end{vmatrix} = (-1)((2)(2) - (1)(-7)) = (-1)(4 + 7) = -11$
$C_{31} = (-1)^{3+1} \begin{vmatrix} 1 & 3 \\ -1 & 0 \end{vmatrix} = (1)((1)(0) - (3)(-1)) = 0 + 3 = 3$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 2 & 3 \\ 4 & 0 \end{vmatrix} = (-1)((2)(0) - (3)(4)) = (-1)(0 - 12) = 12$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 2 & 1 \\ 4 & -1 \end{vmatrix} = (1)((2)(-1) - (1)(4)) = -2 - 4 = -6$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} -1 & -4 & 1 \\ 5 & 23 & -11 \\ 3 & 12 & -6 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} -1 & 5 & 3 \\ -4 & 23 & 12 \\ 1 & -11 & -6 \end{bmatrix}$ ... (i)
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant $|A| = -3$ and the adjoint matrix from (i):
$A^{-1} = \frac{1}{-3} \begin{bmatrix} -1 & 5 & 3 \\ -4 & 23 & 12 \\ 1 & -11 & -6 \end{bmatrix}$
Multiplying the scalar $\frac{1}{-3}$ with each element of the matrix:
$A^{-1} = \begin{bmatrix} \frac{-1}{-3} & \frac{5}{-3} & \frac{3}{-3} \\ \frac{-4}{-3} & \frac{23}{-3} & \frac{12}{-3} \\ \frac{1}{-3} & \frac{-11}{-3} & \frac{-6}{-3} \end{bmatrix}$
Simplifying the fractions:
$A^{-1} = \begin{bmatrix} \frac{1}{3} & -\frac{5}{3} & -1 \\ \frac{4}{3} & -\frac{23}{3} & -4 \\ -\frac{1}{3} & \frac{11}{3} & 2 \end{bmatrix}$ ... (ii)
Question 10. $\begin{bmatrix}1&−1&2\\0&2&−3\\3&−2&4 \end{bmatrix}$
Answer:
Given:
Let the given matrix be A.
$A = \begin{bmatrix}1&−1&2\\0&2&−3\\3&−2&4 \end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
First, we calculate the determinant of A to check if the inverse exists.
$|A| = \det(A) = \begin{vmatrix}1&−1&2\\0&2&−3\\3&−2&4 \end{vmatrix}$
Expanding along the first column:
$|A| = 1 \cdot \begin{vmatrix} 2 & -3 \\ -2 & 4 \end{vmatrix} - 0 \cdot \begin{vmatrix} -1 & 2 \\ -2 & 4 \end{vmatrix} + 3 \cdot \begin{vmatrix} -1 & 2 \\ 2 & -3 \end{vmatrix}$
$|A| = 1((2)(4) - (-3)(-2)) - 0 + 3((-1)(-3) - (2)(2))$
$|A| = 1(8 - 6) + 3(3 - 4)$
$|A| = 1(2) + 3(-1)$
$|A| = 2 - 3 = -1$
Since $|A| = -1 \neq 0$, the matrix A is invertible.
Next, we find the adjoint of A by finding the transpose of the matrix of cofactors.
We calculate the cofactors $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor of the element $a_{ij}$.
$C_{11} = (-1)^{1+1} \begin{vmatrix} 2 & -3 \\ -2 & 4 \end{vmatrix} = (1)((2)(4) - (-3)(-2)) = 8 - 6 = 2$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 0 & -3 \\ 3 & 4 \end{vmatrix} = (-1)((0)(4) - (-3)(3)) = (-1)(0 + 9) = -9$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 0 & 2 \\ 3 & -2 \end{vmatrix} = (1)((0)(-2) - (2)(3)) = 0 - 6 = -6$
$C_{21} = (-1)^{2+1} \begin{vmatrix} -1 & 2 \\ -2 & 4 \end{vmatrix} = (-1)((-1)(4) - (2)(-2)) = (-1)(-4 + 4) = 0$
$C_{22} = (-1)^{2+2} \begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix} = (1)((1)(4) - (2)(3)) = 4 - 6 = -2$
$C_{23} = (-1)^{2+3} \begin{vmatrix} 1 & -1 \\ 3 & -2 \end{vmatrix} = (-1)((1)(-2) - (-1)(3)) = (-1)(-2 + 3) = -1$
$C_{31} = (-1)^{3+1} \begin{vmatrix} -1 & 2 \\ 2 & -3 \end{vmatrix} = (1)((-1)(-3) - (2)(2)) = 3 - 4 = -1$
$C_{32} = (-1)^{3+2} \begin{vmatrix} 1 & 2 \\ 0 & -3 \end{vmatrix} = (-1)((1)(-3) - (2)(0)) = (-1)(-3 - 0) = 3$
$C_{33} = (-1)^{3+3} \begin{vmatrix} 1 & -1 \\ 0 & 2 \end{vmatrix} = (1)((1)(2) - (-1)(0)) = 2 - 0 = 2$
The matrix of cofactors is:
$\text{Cofactor}(A) = \begin{bmatrix} 2 & -9 & -6 \\ 0 & -2 & -1 \\ -1 & 3 & 2 \end{bmatrix}$
The adjoint of A is the transpose of the matrix of cofactors:
$\text{adj } A = (\text{Cofactor}(A))^T = \begin{bmatrix} 2 & 0 & -1 \\ -9 & -2 & 3 \\ -6 & -1 & 2 \end{bmatrix}$ ... (i)
The inverse of a matrix A is given by the formula $A^{-1} = \frac{1}{|A|} \text{ adj } A$.
Substituting the determinant $|A| = -1$ and the adjoint matrix from (i):
$A^{-1} = \frac{1}{-1} \begin{bmatrix} 2 & 0 & -1 \\ -9 & -2 & 3 \\ -6 & -1 & 2 \end{bmatrix}$
Multiplying the scalar $\frac{1}{-1} = -1$ with each element of the matrix:
$A^{-1} = -1 \begin{bmatrix} 2 & 0 & -1 \\ -9 & -2 & 3 \\ -6 & -1 & 2 \end{bmatrix} = \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix}$ ... (ii)
Question 11. $\begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα \end{bmatrix}$
Answer:
Given:
The matrix $A = \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix}$
To Find:
The inverse of matrix A ($A^{-1}$), if it exists.
Solution:
Expanding along the first row, $|A| = 1(−\cos^2α − \sin^2α) = −1 \neq 0$, so the inverse exists. The matrix is also symmetric, i.e., $A^T = A$, so it suffices to show that $A A^T = I$: a square matrix satisfying $A A^T = A^T A = I$ is orthogonal, and its inverse is its transpose.
First, we find the transpose of the matrix $A$. The transpose $A^T$ is obtained by interchanging the rows and columns of $A$.
$A^T = \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix}^T = \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix}$
Now, we calculate the product of the matrix $A$ and its transpose $A^T$.
$A A^T = \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix} \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix}$
Performing the matrix multiplication, we get:
$A A^T = \begin{bmatrix} 1 \cdot 1 + 0 \cdot 0 + 0 \cdot 0 & 1 \cdot 0 + 0 \cdot \cosα + 0 \cdot \sinα & 1 \cdot 0 + 0 \cdot \sinα + 0 \cdot (-\cosα) \\ 0 \cdot 1 + \cosα \cdot 0 + \sinα \cdot 0 & 0 \cdot 0 + \cosα \cdot \cosα + \sinα \cdot \sinα & 0 \cdot 0 + \cosα \cdot \sinα + \sinα \cdot (-\cosα) \\ 0 \cdot 1 + \sinα \cdot 0 + (-\cosα) \cdot 0 & 0 \cdot 0 + \sinα \cdot \cosα + (-\cosα) \cdot \sinα & 0 \cdot 0 + \sinα \cdot \sinα + (-\cosα) \cdot (-\cosα) \end{bmatrix}$
$A A^T = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos^2α + \sin^2α & \cosα \sinα - \sinα \cosα \\ 0 & \sinα \cosα - \cosα \sinα & \sin^2α + \cos^2α \end{bmatrix}$
Using the fundamental trigonometric identity $\cos^2α + \sin^2α = 1$, the matrix simplifies to:
$A A^T = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$
This result is the identity matrix $I$ of order 3.
Since $A A^T = I$ and $A^T = A$, we also have $A^T A = I$, so the given matrix is orthogonal and its inverse is its transpose.
Conclusion:
$A^{-1} = A^T = \begin{bmatrix}1&0&0\\0&\cosα&\sinα\\0&\sinα&−\cosα\end{bmatrix} = A$; that is, the given matrix is orthogonal and is its own inverse.
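A numerical spot-check for one value of $α$ (an add-on, not required by the exercise) confirms that the matrix is orthogonal and is its own inverse:

```python
import numpy as np

alpha = 0.7  # any angle works; chosen arbitrarily for the check
c, s = np.cos(alpha), np.sin(alpha)

A = np.array([[1, 0, 0],
              [0, c, s],
              [0, s, -c]])

print(np.allclose(A @ A.T, np.eye(3)))   # True: A is orthogonal
print(np.allclose(np.linalg.inv(A), A))  # True: A^{-1} = A^T = A
```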
Question 12. Let A = $\begin{bmatrix}3&7\\2&5 \end{bmatrix}$ and B = $\begin{bmatrix}6&8\\7&9 \end{bmatrix}$ . Verify that (AB)-1 = B-1A-1.
Answer:
Given:
Matrix $A = \begin{bmatrix}3&7\\2&5 \end{bmatrix}$
Matrix $B = \begin{bmatrix}6&8\\7&9 \end{bmatrix}$
To Verify:
$(AB)^{-1} = B^{-1}A^{-1}$
Solution:
To verify the property $(AB)^{-1} = B^{-1}A^{-1}$, we need to calculate the inverse of matrix $A$, the inverse of matrix $B$, the product $AB$, the inverse of $AB$, and finally the product $B^{-1}A^{-1}$, and then compare the results.
For a $2 \times 2$ matrix $M = \begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse $M^{-1}$ is given by $M^{-1} = \frac{1}{\det(M)} \begin{bmatrix}d&-b\\-c&a\end{bmatrix}$, where the determinant $\det(M) = ad - bc$. The inverse exists if and only if $\det(M) \neq 0$.
Step 1: Find $A^{-1}$
First, calculate the determinant of $A$:
$\det(A) = (3)(5) - (7)(2) = 15 - 14 = 1$
Since $\det(A) = 1 \neq 0$, the inverse $A^{-1}$ exists.
Now, find the adjoint of $A$ (by swapping the diagonal elements and negating the off-diagonal elements):
$\text{adj}(A) = \begin{bmatrix}5&-7\\-2&3\end{bmatrix}$
Calculate $A^{-1}$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{1} \begin{bmatrix}5&-7\\-2&3\end{bmatrix} = \begin{bmatrix}5&-7\\-2&3\end{bmatrix}$
So, $A^{-1} = \begin{bmatrix}5&-7\\-2&3\end{bmatrix}$.
Step 2: Find $B^{-1}$
First, calculate the determinant of $B$:
$\det(B) = (6)(9) - (8)(7) = 54 - 56 = -2$
Since $\det(B) = -2 \neq 0$, the inverse $B^{-1}$ exists.
Now, find the adjoint of $B$:
$\text{adj}(B) = \begin{bmatrix}9&-8\\-7&6\end{bmatrix}$
Calculate $B^{-1}$:
$B^{-1} = \frac{1}{\det(B)} \text{adj}(B) = \frac{1}{-2} \begin{bmatrix}9&-8\\-7&6\end{bmatrix} = \begin{bmatrix}\frac{9}{-2}&\frac{-8}{-2}\\\frac{-7}{-2}&\frac{6}{-2}\end{bmatrix} = \begin{bmatrix}-\frac{9}{2}&4\\\frac{7}{2}&-3\end{bmatrix}$
So, $B^{-1} = \begin{bmatrix}-\frac{9}{2}&4\\\frac{7}{2}&-3\end{bmatrix}$.
Step 3: Find $AB$
Multiply matrix $A$ by matrix $B$:
$AB = \begin{bmatrix}3&7\\2&5 \end{bmatrix} \begin{bmatrix}6&8\\7&9 \end{bmatrix}$
$AB = \begin{bmatrix}(3)(6) + (7)(7) & (3)(8) + (7)(9)\\(2)(6) + (5)(7) & (2)(8) + (5)(9)\end{bmatrix}$
$AB = \begin{bmatrix}18 + 49 & 24 + 63\\12 + 35 & 16 + 45\end{bmatrix}$
$AB = \begin{bmatrix}67&87\\47&61\end{bmatrix}$
So, $AB = \begin{bmatrix}67&87\\47&61\end{bmatrix}$.
Step 4: Find $(AB)^{-1}$
First, calculate the determinant of $AB$:
$\det(AB) = (67)(61) - (87)(47) = 4087 - 4089 = -2$
Since $\det(AB) = -2 \neq 0$, the inverse $(AB)^{-1}$ exists.
Now, find the adjoint of $AB$:
$\text{adj}(AB) = \begin{bmatrix}61&-87\\-47&67\end{bmatrix}$
Calculate $(AB)^{-1}$:
$(AB)^{-1} = \frac{1}{\det(AB)} \text{adj}(AB) = \frac{1}{-2} \begin{bmatrix}61&-87\\-47&67\end{bmatrix} = \begin{bmatrix}\frac{61}{-2}&\frac{-87}{-2}\\\frac{-47}{-2}&\frac{67}{-2}\end{bmatrix} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$
So, $(AB)^{-1} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$.
Step 5: Find $B^{-1}A^{-1}$
Multiply matrix $B^{-1}$ by matrix $A^{-1}$:
$B^{-1}A^{-1} = \begin{bmatrix}-\frac{9}{2}&4\\\frac{7}{2}&-3\end{bmatrix} \begin{bmatrix}5&-7\\-2&3\end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix}(-\frac{9}{2})(5) + (4)(-2) & (-\frac{9}{2})(-7) + (4)(3)\\( \frac{7}{2})(5) + (-3)(-2) & ( \frac{7}{2})(-7) + (-3)(3)\end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix}-\frac{45}{2} - 8 & \frac{63}{2} + 12\\ \frac{35}{2} + 6 & -\frac{49}{2} - 9\end{bmatrix}$
To add/subtract the fractions, find common denominators:
$B^{-1}A^{-1} = \begin{bmatrix}-\frac{45}{2} - \frac{16}{2} & \frac{63}{2} + \frac{24}{2}\\ \frac{35}{2} + \frac{12}{2} & -\frac{49}{2} - \frac{18}{2}\end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix}\frac{-45 - 16}{2} & \frac{63 + 24}{2}\\ \frac{35 + 12}{2} & \frac{-49 - 18}{2}\end{bmatrix}$
$B^{-1}A^{-1} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$
So, $B^{-1}A^{-1} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$.
Step 6: Comparison
From Step 4, we have $(AB)^{-1} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$.
From Step 5, we have $B^{-1}A^{-1} = \begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$.
Comparing the two results, we see that the matrices are identical.
Conclusion:
Since $(AB)^{-1}$ and $B^{-1}A^{-1}$ are both equal to $\begin{bmatrix}-\frac{61}{2}&\frac{87}{2}\\\frac{47}{2}&-\frac{67}{2}\end{bmatrix}$, the property $(AB)^{-1} = B^{-1}A^{-1}$ is successfully verified for the given matrices $A$ and $B$.
Question 13. If A = $\begin{bmatrix}3&1\\−1&2 \end{bmatrix}$ , show that A2 – 5A + 7I = O. Hence find A–1.
Answer:
Given:
Matrix $A = \begin{bmatrix}3&1\\-1&2 \end{bmatrix}$
$I$ is the identity matrix of order $2 \times 2$, i.e., $I = \begin{bmatrix}1&0\\0&1 \end{bmatrix}$.
$O$ is the zero matrix of order $2 \times 2$, i.e., $O = \begin{bmatrix}0&0\\0&0 \end{bmatrix}$.
To Show:
$A^2 - 5A + 7I = O$
To Find:
$A^{-1}$ using the given equation.
Solution:
First, we calculate $A^2$:
$A^2 = A \times A = \begin{bmatrix}3&1\\-1&2 \end{bmatrix} \begin{bmatrix}3&1\\-1&2 \end{bmatrix}$
$A^2 = \begin{bmatrix}(3)(3) + (1)(-1) & (3)(1) + (1)(2)\\(-1)(3) + (2)(-1) & (-1)(1) + (2)(2)\end{bmatrix}$
$A^2 = \begin{bmatrix}9 - 1 & 3 + 2\\-3 - 2 & -1 + 4\end{bmatrix}$
$A^2 = \begin{bmatrix}8&5\\-5&3\end{bmatrix}$
Next, we calculate $5A$:
$5A = 5 \begin{bmatrix}3&1\\-1&2 \end{bmatrix} = \begin{bmatrix}5 \times 3 & 5 \times 1\\5 \times -1 & 5 \times 2 \end{bmatrix}$
$5A = \begin{bmatrix}15&5\\-5&10\end{bmatrix}$
Now, we calculate $7I$:
$7I = 7 \begin{bmatrix}1&0\\0&1 \end{bmatrix} = \begin{bmatrix}7 \times 1 & 7 \times 0\\7 \times 0 & 7 \times 1 \end{bmatrix}$
$7I = \begin{bmatrix}7&0\\0&7\end{bmatrix}$
Now, we substitute these results into the expression $A^2 - 5A + 7I$:
$A^2 - 5A + 7I = \begin{bmatrix}8&5\\-5&3\end{bmatrix} - \begin{bmatrix}15&5\\-5&10\end{bmatrix} + \begin{bmatrix}7&0\\0&7\end{bmatrix}$
$A^2 - 5A + 7I = \begin{bmatrix}8 - 15&5 - 5\\-5 - (-5)&3 - 10\end{bmatrix} + \begin{bmatrix}7&0\\0&7\end{bmatrix}$
$A^2 - 5A + 7I = \begin{bmatrix}-7&0\\-5 + 5&-7\end{bmatrix} + \begin{bmatrix}7&0\\0&7\end{bmatrix}$
$A^2 - 5A + 7I = \begin{bmatrix}-7&0\\0&-7\end{bmatrix} + \begin{bmatrix}7&0\\0&7\end{bmatrix}$
$A^2 - 5A + 7I = \begin{bmatrix}-7 + 7&0 + 0\\0 + 0&-7 + 7\end{bmatrix}$
$A^2 - 5A + 7I = \begin{bmatrix}0&0\\0&0\end{bmatrix}$
This is the zero matrix $O$.
Thus, we have shown that $A^2 - 5A + 7I = O$.
Finding $A^{-1}$ using the equation:
We have the equation:
$A^2 - 5A + 7I = O$
To find $A^{-1}$, we can multiply the entire equation by $A^{-1}$ from the left. Note that since the determinant of A is $(3)(2) - (1)(-1) = 6 + 1 = 7 \neq 0$, $A^{-1}$ exists.
$A^{-1}(A^2 - 5A + 7I) = A^{-1}O$
Using the distributive property of matrix multiplication:
$A^{-1}A^2 - A^{-1}(5A) + A^{-1}(7I) = O$
Rearranging terms and using the properties $A^{-1}A = I$, $IA = A$, and $A^{-1}I = A^{-1}$:
$(A^{-1}A)A - 5(A^{-1}A) + 7(A^{-1}I) = O$
$IA - 5I + 7A^{-1} = O$
$A - 5I + 7A^{-1} = O$
Now, we solve for $A^{-1}$:
$7A^{-1} = 5I - A$
$A^{-1} = \frac{1}{7}(5I - A)$
Now, we calculate the matrix $5I - A$:
$5I - A = 5\begin{bmatrix}1&0\\0&1 \end{bmatrix} - \begin{bmatrix}3&1\\-1&2 \end{bmatrix}$
$5I - A = \begin{bmatrix}5&0\\0&5 \end{bmatrix} - \begin{bmatrix}3&1\\-1&2 \end{bmatrix}$
$5I - A = \begin{bmatrix}5 - 3&0 - 1\\0 - (-1)&5 - 2 \end{bmatrix}$
$5I - A = \begin{bmatrix}2&-1\\1&3 \end{bmatrix}$
Finally, we calculate $A^{-1}$:
$A^{-1} = \frac{1}{7} \begin{bmatrix}2&-1\\1&3 \end{bmatrix}$
$A^{-1} = \begin{bmatrix}\frac{2}{7}&-\frac{1}{7}\\\frac{1}{7}&\frac{3}{7} \end{bmatrix}$
Conclusion:
We have shown that $A^2 - 5A + 7I = O$.
Using this equation, the inverse of matrix A is found to be:
$A^{-1} = \begin{bmatrix}\frac{2}{7}&-\frac{1}{7}\\\frac{1}{7}&\frac{3}{7} \end{bmatrix}$
Question 14. For the matrix A = $\begin{bmatrix}3&2\\1&1 \end{bmatrix}$ , find the numbers a and b such that A2 + aA + bI = O.
Answer:
Given:
Matrix $A = \begin{bmatrix}3&2\\1&1 \end{bmatrix}$
The equation $A^2 + aA + bI = O$, where $I$ is the identity matrix and $O$ is the zero matrix.
To Find:
The values of the numbers $a$ and $b$.
Solution:
We are given the equation $A^2 + aA + bI = O$. We need to calculate each term on the left side and then equate the sum to the zero matrix to find $a$ and $b$.
The identity matrix $I$ of the same order as $A$ ($2 \times 2$) is $\begin{bmatrix}1&0\\0&1 \end{bmatrix}$.
The zero matrix $O$ of the same order ($2 \times 2$) is $\begin{bmatrix}0&0\\0&0 \end{bmatrix}$.
First, calculate $A^2$:
$A^2 = A \times A = \begin{bmatrix}3&2\\1&1 \end{bmatrix} \begin{bmatrix}3&2\\1&1 \end{bmatrix}$
$A^2 = \begin{bmatrix}(3)(3) + (2)(1) & (3)(2) + (2)(1)\\(1)(3) + (1)(1) & (1)(2) + (1)(1)\end{bmatrix}$
$A^2 = \begin{bmatrix}9 + 2 & 6 + 2\\3 + 1 & 2 + 1\end{bmatrix} = \begin{bmatrix}11&8\\4&3\end{bmatrix}$
Next, calculate $aA$:
$aA = a \begin{bmatrix}3&2\\1&1 \end{bmatrix} = \begin{bmatrix}a \times 3&a \times 2\\a \times 1&a \times 1\end{bmatrix} = \begin{bmatrix}3a&2a\\a&a\end{bmatrix}$
Next, calculate $bI$:
$bI = b \begin{bmatrix}1&0\\0&1 \end{bmatrix} = \begin{bmatrix}b \times 1&b \times 0\\b \times 0&b \times 1\end{bmatrix} = \begin{bmatrix}b&0\\0&b\end{bmatrix}$
Now, substitute these results into the given equation $A^2 + aA + bI = O$:
$\begin{bmatrix}11&8\\4&3\end{bmatrix} + \begin{bmatrix}3a&2a\\a&a\end{bmatrix} + \begin{bmatrix}b&0\\0&b\end{bmatrix} = \begin{bmatrix}0&0\\0&0\end{bmatrix}$
Perform the matrix addition on the left side:
$\begin{bmatrix}11 + 3a + b & 8 + 2a + 0\\4 + a + 0 & 3 + a + b\end{bmatrix} = \begin{bmatrix}0&0\\0&0\end{bmatrix}$
$\begin{bmatrix}11 + 3a + b & 8 + 2a\\4 + a & 3 + a + b\end{bmatrix} = \begin{bmatrix}0&0\\0&0\end{bmatrix}$
For two matrices to be equal, their corresponding elements must be equal. This gives us a system of linear equations:
$11 + 3a + b = 0 \quad$...(1)
$8 + 2a = 0 \quad$...(2)
$4 + a = 0 \quad$...(3)
$3 + a + b = 0 \quad$...(4)
From equation (2):
$2a = -8$
$a = \frac{-8}{2} = -4$
From equation (3):
$a = -4$
The value of $a$ is consistent from both equations.
Now substitute $a = -4$ into equation (4):
$3 + (-4) + b = 0$
$3 - 4 + b = 0$
$-1 + b = 0$
$b = 1$
Let's verify these values of $a$ and $b$ using equation (1):
$11 + 3a + b = 11 + 3(-4) + 1 = 11 - 12 + 1 = -1 + 1 = 0$.
The values $a=-4$ and $b=1$ satisfy all the equations.
Conclusion:
The numbers $a$ and $b$ such that $A^2 + aA + bI = O$ are $a = -4$ and $b = 1$.
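The same pair $(a, b)$ can be recovered symbolically; the sketch below (using SymPy, my own choice rather than the textbook's method) sets every entry of $A^2 + aA + bI$ to zero and solves the resulting linear system:

```python
import sympy as sp

a, b = sp.symbols('a b')
A = sp.Matrix([[3, 2], [1, 1]])
expr = A * A + a * A + b * sp.eye(2)

# Solve the four entry-wise equations simultaneously
solution = sp.solve([expr[i, j] for i in range(2) for j in range(2)], [a, b])
print(solution)  # {a: -4, b: 1}
```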
Question 15. For the matrix A = $\begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix}$
Show that A3 – 6A2 + 5A + 11 I = O. Hence, find A–1.
Answer:
Given:
Matrix $A = \begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix}$
$I$ is the identity matrix of order $3 \times 3$, i.e., $I = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix}$.
$O$ is the zero matrix of order $3 \times 3$, i.e., $O = \begin{bmatrix}0&0&0\\0&0&0\\0&0&0 \end{bmatrix}$.
To Show:
$A^3 - 6A^2 + 5A + 11 I = O$
To Find:
$A^{-1}$ using the given equation.
Solution:
First, we calculate $A^2$:
$A^2 = A \times A = \begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix} \begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix}$
$A^2 = \begin{bmatrix} (1)(1) + (1)(1) + (1)(2) & (1)(1) + (1)(2) + (1)(-1) & (1)(1) + (1)(-3) + (1)(3) \\ (1)(1) + (2)(1) + (-3)(2) & (1)(1) + (2)(2) + (-3)(-1) & (1)(1) + (2)(-3) + (-3)(3) \\ (2)(1) + (-1)(1) + (3)(2) & (2)(1) + (-1)(2) + (3)(-1) & (2)(1) + (-1)(-3) + (3)(3) \end{bmatrix}$
$A^2 = \begin{bmatrix} 1 + 1 + 2 & 1 + 2 - 1 & 1 - 3 + 3 \\ 1 + 2 - 6 & 1 + 4 + 3 & 1 - 6 - 9 \\ 2 - 1 + 6 & 2 - 2 - 3 & 2 + 3 + 9 \end{bmatrix}$
$A^2 = \begin{bmatrix}4&2&1\\-3&8&-14\\7&-3&14\end{bmatrix}$
Next, we calculate $A^3$:
$A^3 = A^2 \times A = \begin{bmatrix}4&2&1\\-3&8&-14\\7&-3&14\end{bmatrix} \begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix}$
$A^3 = \begin{bmatrix} (4)(1) + (2)(1) + (1)(2) & (4)(1) + (2)(2) + (1)(-1) & (4)(1) + (2)(-3) + (1)(3) \\ (-3)(1) + (8)(1) + (-14)(2) & (-3)(1) + (8)(2) + (-14)(-1) & (-3)(1) + (8)(-3) + (-14)(3) \\ (7)(1) + (-3)(1) + (14)(2) & (7)(1) + (-3)(2) + (14)(-1) & (7)(1) + (-3)(-3) + (14)(3) \end{bmatrix}$
$A^3 = \begin{bmatrix} 4 + 2 + 2 & 4 + 4 - 1 & 4 - 6 + 3 \\ -3 + 8 - 28 & -3 + 16 + 14 & -3 - 24 - 42 \\ 7 - 3 + 28 & 7 - 6 - 14 & 7 + 9 + 42 \end{bmatrix}$
$A^3 = \begin{bmatrix}8&7&1\\-23&27&-69\\32&-13&58\end{bmatrix}$
Now, we calculate the terms $6A^2$, $5A$, and $11I$:
$6A^2 = 6 \begin{bmatrix}4&2&1\\-3&8&-14\\7&-3&14\end{bmatrix} = \begin{bmatrix}24&12&6\\-18&48&-84\\42&-18&84\end{bmatrix}$
$5A = 5 \begin{bmatrix}1&1&1\\1&2&−3\\2&−1&3 \end{bmatrix} = \begin{bmatrix}5&5&5\\5&10&-15\\10&-5&15\end{bmatrix}$
$11I = 11 \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix} = \begin{bmatrix}11&0&0\\0&11&0\\0&0&11\end{bmatrix}$
Now, substitute these matrices into the expression $A^3 - 6A^2 + 5A + 11I$:
$A^3 - 6A^2 + 5A + 11I = \begin{bmatrix}8&7&1\\-23&27&-69\\32&-13&58\end{bmatrix} - \begin{bmatrix}24&12&6\\-18&48&-84\\42&-18&84\end{bmatrix} + \begin{bmatrix}5&5&5\\5&10&-15\\10&-5&15\end{bmatrix} + \begin{bmatrix}11&0&0\\0&11&0\\0&0&11\end{bmatrix}$
Perform the matrix addition and subtraction element by element:
$A^3 - 6A^2 + 5A + 11I = \begin{bmatrix} 8 - 24 + 5 + 11 & 7 - 12 + 5 + 0 & 1 - 6 + 5 + 0 \\ -23 - (-18) + 5 + 0 & 27 - 48 + 10 + 11 & -69 - (-84) - 15 + 0 \\ 32 - 42 + 10 + 0 & -13 - (-18) - 5 + 0 & 58 - 84 + 15 + 11 \end{bmatrix}$
$A^3 - 6A^2 + 5A + 11I = \begin{bmatrix} 8 - 24 + 5 + 11 & 7 - 12 + 5 & 1 - 6 + 5 \\ -23 + 18 + 5 & 27 - 48 + 10 + 11 & -69 + 84 - 15 \\ 32 - 42 + 10 & -13 + 18 - 5 & 58 - 84 + 15 + 11 \end{bmatrix}$
$A^3 - 6A^2 + 5A + 11I = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$
This is the zero matrix $O$.
Thus, we have shown that $A^3 - 6A^2 + 5A + 11 I = O$.
Finding $A^{-1}$ using the equation:
We have the equation:
$A^3 - 6A^2 + 5A + 11 I = O$
To find $A^{-1}$, we multiply the equation by $A^{-1}$ from the left (or right). We need to ensure $A^{-1}$ exists. The determinant of A is $\det(A) = 1(2 \cdot 3 - (-3) \cdot (-1)) - 1(1 \cdot 3 - (-3) \cdot 2) + 1(1 \cdot (-1) - 2 \cdot 2) = 1(6 - 3) - 1(3 + 6) + 1(-1 - 4) = 1(3) - 1(9) + 1(-5) = 3 - 9 - 5 = -11$. Since $\det(A) = -11 \neq 0$, $A^{-1}$ exists.
Multiplying by $A^{-1}$:
$A^{-1}(A^3 - 6A^2 + 5A + 11 I) = A^{-1}O$
$A^{-1}A^3 - 6A^{-1}A^2 + 5A^{-1}A + 11A^{-1}I = O$
Using the properties $A^{-1}A = I$, $A^{-1}A^k = A^{k-1}$ for $k \geq 1$, $A^{-1}I = A^{-1}$, and $A^{-1}O = O$:
$A^2 - 6A + 5I + 11A^{-1} = O$
Now, we solve for $A^{-1}$:
$11A^{-1} = -A^2 + 6A - 5I$
$A^{-1} = \frac{1}{11}(-A^2 + 6A - 5I)$
We have already calculated $A^2$, $6A$, and $5I$. Substitute these values:
$-A^2 = -\begin{bmatrix}4&2&1\\-3&8&-14\\7&-3&14\end{bmatrix} = \begin{bmatrix}-4&-2&-1\\3&-8&14\\-7&3&-14\end{bmatrix}$
$6A = \begin{bmatrix}6&6&6\\6&12&-18\\12&-6&18\end{bmatrix}$
$-5I = -\begin{bmatrix}5&0&0\\0&5&0\\0&0&5\end{bmatrix} = \begin{bmatrix}-5&0&0\\0&-5&0\\0&0&-5\end{bmatrix}$
Now calculate the sum $-A^2 + 6A - 5I$:
$-A^2 + 6A - 5I = \begin{bmatrix}-4&-2&-1\\3&-8&14\\-7&3&-14\end{bmatrix} + \begin{bmatrix}6&6&6\\6&12&-18\\12&-6&18\end{bmatrix} + \begin{bmatrix}-5&0&0\\0&-5&0\\0&0&-5\end{bmatrix}$
$-A^2 + 6A - 5I = \begin{bmatrix} -4 + 6 - 5 & -2 + 6 + 0 & -1 + 6 + 0 \\ 3 + 6 + 0 & -8 + 12 - 5 & 14 - 18 + 0 \\ -7 + 12 + 0 & 3 - 6 + 0 & -14 + 18 - 5 \end{bmatrix}$
$-A^2 + 6A - 5I = \begin{bmatrix} -3 & 4 & 5 \\ 9 & -1 & -4 \\ 5 & -3 & -1 \end{bmatrix}$
Finally, calculate $A^{-1}$:
$A^{-1} = \frac{1}{11} \begin{bmatrix}-3&4&5\\9&-1&-4\\5&-3&-1\end{bmatrix}$
$A^{-1} = \begin{bmatrix}-\frac{3}{11}&\frac{4}{11}&\frac{5}{11}\\\frac{9}{11}&-\frac{1}{11}&-\frac{4}{11}\\\frac{5}{11}&-\frac{3}{11}&-\frac{1}{11}\end{bmatrix}$
Conclusion:
We have shown that $A^3 - 6A^2 + 5A + 11 I = O$.
Using this equation, the inverse of matrix A is found to be:
$A^{-1} = \begin{bmatrix}-\frac{3}{11}&\frac{4}{11}&\frac{5}{11}\\\frac{9}{11}&-\frac{1}{11}&-\frac{4}{11}\\\frac{5}{11}&-\frac{3}{11}&-\frac{1}{11}\end{bmatrix}$
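The polynomial route to the inverse, $A^{-1} = \frac{1}{11}(6A - A^2 - 5I)$, is easy to cross-check numerically; the sketch below is an illustrative add-on:

```python
import numpy as np

A = np.array([[1, 1, 1],
              [1, 2, -3],
              [2, -1, 3]], dtype=float)
I = np.eye(3)

A2, A3 = A @ A, A @ A @ A
print(np.allclose(A3 - 6 * A2 + 5 * A + 11 * I, np.zeros((3, 3))))  # True: the identity holds

A_inv = (-A2 + 6 * A - 5 * I) / 11
print(np.allclose(A @ A_inv, I))             # True: this really is A^{-1}
print(np.allclose(A_inv, np.linalg.inv(A)))  # True: agrees with the direct inverse
```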
Question 16. If A = $\begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix}$
Verify that A3 – 6A2 + 9A – 4I = O and hence find A–1
Answer:
Given:
Matrix $A = \begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix}$
$I$ is the identity matrix of order $3 \times 3$, i.e., $I = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix}$.
$O$ is the zero matrix of order $3 \times 3$, i.e., $O = \begin{bmatrix}0&0&0\\0&0&0\\0&0&0 \end{bmatrix}$.
To Verify:
The equation $A^3 – 6A^2 + 9A – 4I = O$.
To Find:
$A^{-1}$ using the given equation.
Solution:
First, we calculate $A^2$:
$A^2 = A \times A = \begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix} \begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix}$
Performing matrix multiplication:
$A^2 = \begin{bmatrix} (2)(2)+(-1)(-1)+(1)(1) & (2)(-1)+(-1)(2)+(1)(-1) & (2)(1)+(-1)(-1)+(1)(2) \\ (-1)(2)+(2)(-1)+(-1)(1) & (-1)(-1)+(2)(2)+(-1)(-1) & (-1)(1)+(2)(-1)+(-1)(2) \\ (1)(2)+(-1)(-1)+(2)(1) & (1)(-1)+(-1)(2)+(2)(-1) & (1)(1)+(-1)(-1)+(2)(2) \end{bmatrix}$
$A^2 = \begin{bmatrix} 4+1+1 & -2-2-1 & 2+1+2 \\ -2-2-1 & 1+4+1 & -1-2-2 \\ 2+1+2 & -1-2-2 & 1+1+4 \end{bmatrix}$
$A^2 = \begin{bmatrix}6&-5&5\\-5&6&-5\\5&-5&6\end{bmatrix}$
Next, we calculate $A^3$:
$A^3 = A^2 \times A = \begin{bmatrix}6&-5&5\\-5&6&-5\\5&-5&6\end{bmatrix} \begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix}$
Performing matrix multiplication:
$A^3 = \begin{bmatrix} (6)(2)+(-5)(-1)+(5)(1) & (6)(-1)+(-5)(2)+(5)(-1) & (6)(1)+(-5)(-1)+(5)(2) \\ (-5)(2)+(6)(-1)+(-5)(1) & (-5)(-1)+(6)(2)+(-5)(-1) & (-5)(1)+(6)(-1)+(-5)(2) \\ (5)(2)+(-5)(-1)+(6)(1) & (5)(-1)+(-5)(2)+(6)(-1) & (5)(1)+(-5)(-1)+(6)(2) \end{bmatrix}$
$A^3 = \begin{bmatrix} 12+5+5 & -6-10-5 & 6+5+10 \\ -10-6-5 & 5+12+5 & -5-6-10 \\ 10+5+6 & -5-10-6 & 5+5+12 \end{bmatrix}$
$A^3 = \begin{bmatrix}22&-21&21\\-21&22&-21\\21&-21&22\end{bmatrix}$
Now, we calculate the scalar multiples $6A^2$, $9A$, and $4I$:
$6A^2 = 6 \begin{bmatrix}6&-5&5\\-5&6&-5\\5&-5&6\end{bmatrix} = \begin{bmatrix}36&-30&30\\-30&36&-30\\30&-30&36\end{bmatrix}$
$9A = 9 \begin{bmatrix}2&−1&1\\−1&2&−1\\1&−1&2 \end{bmatrix} = \begin{bmatrix}18&-9&9\\-9&18&-9\\9&-9&18\end{bmatrix}$
$4I = 4 \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix} = \begin{bmatrix}4&0&0\\0&4&0\\0&0&4\end{bmatrix}$
Now, substitute these matrices into the expression $A^3 - 6A^2 + 9A - 4I$:
$A^3 - 6A^2 + 9A - 4I = \begin{bmatrix}22&-21&21\\-21&22&-21\\21&-21&22\end{bmatrix} - \begin{bmatrix}36&-30&30\\-30&36&-30\\30&-30&36\end{bmatrix} + \begin{bmatrix}18&-9&9\\-9&18&-9\\9&-9&18\end{bmatrix} - \begin{bmatrix}4&0&0\\0&4&0\\0&0&4\end{bmatrix}$
Performing the matrix addition and subtraction element by element:
$A^3 - 6A^2 + 9A - 4I = \begin{bmatrix} 22 - 36 + 18 - 4 & -21 - (-30) + (-9) - 0 & 21 - 30 + 9 - 0 \\ -21 - (-30) + (-9) - 0 & 22 - 36 + 18 - 4 & -21 - (-30) + (-9) - 0 \\ 21 - 30 + 9 - 0 & -21 - (-30) + (-9) - 0 & 22 - 36 + 18 - 4 \end{bmatrix}$
$A^3 - 6A^2 + 9A - 4I = \begin{bmatrix} 22 - 36 + 18 - 4 & -21 + 30 - 9 & 21 - 30 + 9 \\ -21 + 30 - 9 & 22 - 36 + 18 - 4 & -21 + 30 - 9 \\ 21 - 30 + 9 & -21 + 30 - 9 & 22 - 36 + 18 - 4 \end{bmatrix}$
$A^3 - 6A^2 + 9A - 4I = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$
This is the zero matrix $O$.
Thus, the equation $A^3 – 6A^2 + 9A – 4I = O$ is successfully verified.
Finding $A^{-1}$ using the equation:
We have the equation:
$A^3 – 6A^2 + 9A – 4I = O$
To find $A^{-1}$, we can multiply the entire equation by $A^{-1}$. First, we check if $A^{-1}$ exists by calculating the determinant of A.
$\det(A) = 2(2 \cdot 2 - (-1)(-1)) - (-1)((-1) \cdot 2 - (-1) \cdot 1) + 1((-1) \cdot (-1) - 2 \cdot 1)$
$\det(A) = 2(4 - 1) + 1(-2 + 1) + 1(1 - 2)$
$\det(A) = 2(3) + 1(-1) + 1(-1) = 6 - 1 - 1 = 4$
Since $\det(A) = 4 \neq 0$, $A^{-1}$ exists.
Multiply the equation $A^3 – 6A^2 + 9A – 4I = O$ by $A^{-1}$ from the left:
$A^{-1}(A^3 – 6A^2 + 9A – 4I) = A^{-1}O$
Using the distributive property and properties of matrix inverse ($A^{-1}A = I$, $A^{-1}A^k = A^{k-1}$, $A^{-1}I = A^{-1}$, $XO = O$):
$A^{-1}A^3 – 6A^{-1}A^2 + 9A^{-1}A – 4A^{-1}I = O$
$A^2 – 6A + 9I – 4A^{-1} = O$
Now, we rearrange the equation to solve for $A^{-1}$:
$4A^{-1} = A^2 – 6A + 9I$
$A^{-1} = \frac{1}{4}(A^2 – 6A + 9I)$
Now, we calculate the matrix $A^2 – 6A + 9I$. We have already calculated $A^2$; the terms $6A$ and $9I$ follow directly by scalar multiplication.
$A^2 – 6A + 9I = \begin{bmatrix}6&-5&5\\-5&6&-5\\5&-5&6\end{bmatrix} - \begin{bmatrix}12&-6&6\\-6&12&-6\\6&-6&12\end{bmatrix} + \begin{bmatrix}9&0&0\\0&9&0\\0&0&9\end{bmatrix}$
$A^2 – 6A + 9I = \begin{bmatrix}6 - 12 + 9 & -5 - (-6) + 0 & 5 - 6 + 0 \\ -5 - (-6) + 0 & 6 - 12 + 9 & -5 - (-6) + 0 \\ 5 - 6 + 0 & -5 - (-6) + 0 & 6 - 12 + 9\end{bmatrix}$
$A^2 – 6A + 9I = \begin{bmatrix}6 - 12 + 9 & -5 + 6 & 5 - 6 \\ -5 + 6 & 6 - 12 + 9 & -5 + 6 \\ 5 - 6 & -5 + 6 & 6 - 12 + 9\end{bmatrix}$
$A^2 – 6A + 9I = \begin{bmatrix}3&1&-1\\1&3&1\\-1&1&3\end{bmatrix}$
Finally, calculate $A^{-1}$:
$A^{-1} = \frac{1}{4} \begin{bmatrix}3&1&-1\\1&3&1\\-1&1&3\end{bmatrix}$
$A^{-1} = \begin{bmatrix}\frac{3}{4}&\frac{1}{4}&-\frac{1}{4}\\\frac{1}{4}&\frac{3}{4}&\frac{1}{4}\\-\frac{1}{4}&\frac{1}{4}&\frac{3}{4}\end{bmatrix}$
Conclusion:
The equation $A^3 – 6A^2 + 9A – 4I = O$ is verified.
Using this equation, the inverse of matrix A is found to be:
$A^{-1} = \begin{bmatrix}\frac{3}{4}&\frac{1}{4}&-\frac{1}{4}\\\frac{1}{4}&\frac{3}{4}&\frac{1}{4}\\-\frac{1}{4}&\frac{1}{4}&\frac{3}{4}\end{bmatrix}$
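The result can also be cross-checked numerically. The following is a minimal sketch, assuming Python with NumPy is available; it evaluates the matrix polynomial and the inverse derived from it.

```python
import numpy as np

# Numerical check of A^3 - 6A^2 + 9A - 4I = O and of A^{-1} = (A^2 - 6A + 9I)/4.
A = np.array([[2, -1, 1],
              [-1, 2, -1],
              [1, -1, 2]], dtype=float)
I = np.eye(3)

lhs = A @ A @ A - 6 * (A @ A) + 9 * A - 4 * I
print(np.allclose(lhs, np.zeros((3, 3))))   # True: the identity holds

A_inv = (A @ A - 6 * A + 9 * I) / 4
print(np.allclose(A @ A_inv, I))            # True: this really is A^{-1}
print(A_inv)                                # matches [[3/4, 1/4, -1/4], ...] above
```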
Question 17. Let A be a nonsingular square matrix of order 3 × 3. Then |adj A| is equal to
(A) $|A|$
(B) $|A|^2$
(C) $|A|^3$
(D) $3|A|$
Answer:
Given:
$A$ is a nonsingular square matrix of order $3 \times 3$.
A nonsingular matrix means its determinant is non-zero, i.e., $|A| \neq 0$.
To Find:
The value of $|\text{adj} A|$.
Solution:
We know the property that for any square matrix $A$ of order $n$, the product of the matrix and its adjoint is given by:
$A (\text{adj} A) = (\text{adj} A) A = |A| I_n$
where $I_n$ is the identity matrix of order $n$.
In this problem, the order of the matrix $A$ is $n=3$. So, $I_n$ is the $3 \times 3$ identity matrix $I_3 = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$.
Thus, we have:
$A (\text{adj} A) = |A| I_3$
Now, we take the determinant of both sides of the equation:
$|A (\text{adj} A)| = ||A| I_3|$
Using the property of determinants $|AB| = |A| |B|$:
$|A| |\text{adj} A| = ||A| I_3|$
For a scalar $k$ and an identity matrix $I_n$ of order $n$, the determinant is $|k I_n| = k^n$. In this case, the scalar is $|A|$ and the order is $n=3$.
So, $||A| I_3| = |A|^3$.
Substituting this back into the equation:
$|A| |\text{adj} A| = |A|^3$
Since $A$ is a nonsingular matrix, $|A| \neq 0$. We can divide both sides of the equation by $|A|$:
$|\text{adj} A| = \frac{|A|^3}{|A|}$
$|\text{adj} A| = |A|^{3-1}$
$|\text{adj} A| = |A|^2$
The general formula for the determinant of the adjoint of a square matrix $A$ of order $n$ is $|\text{adj} A| = |A|^{n-1}$. For $n=3$, this gives $|\text{adj} A| = |A|^{3-1} = |A|^2$.
Conclusion:
The value of $|\text{adj} A|$ is $|A|^2$. This corresponds to option (B).
The final answer is (B) $|A|^2$.
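The property $|\text{adj } A| = |A|^{n-1}$ can be spot-checked numerically. Below is a minimal sketch, assuming Python with NumPy; it recovers $\text{adj } A$ from the relation $\text{adj } A = |A|\,A^{-1}$ (valid for nonsingular $A$) and compares determinants.

```python
import numpy as np

# Spot-check of |adj A| = |A|^(n-1) for a nonsingular 3x3 matrix.
# For nonsingular A, adj A = det(A) * inv(A), from A(adj A) = det(A) I.
A = np.array([[2, -1, 1],
              [-1, 2, -1],
              [1, -1, 2]], dtype=float)   # any nonsingular 3x3 matrix works

det_A = np.linalg.det(A)                  # 4
adj_A = det_A * np.linalg.inv(A)

print(round(np.linalg.det(adj_A), 6))     # 16.0
print(round(det_A ** 2, 6))               # 16.0, i.e. |adj A| = |A|^2
```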
Question 18. If A is an invertible matrix of order 2, then $\det(A^{-1})$ is equal to
(A) det (A)
(B) $\frac{1}{\det(A)}$
(C) 1
(D) 0
Answer:
Given:
$A$ is an invertible square matrix of order $2 \times 2$.
An invertible matrix is also known as a nonsingular matrix, which means its determinant is non-zero, i.e., $\det(A) = |A| \neq 0$.
To Find:
The value of $\det(A^{-1})$.
Solution:
By the definition of an inverse matrix, for an invertible matrix $A$, there exists a matrix $A^{-1}$ such that:
$A A^{-1} = I$
where $I$ is the identity matrix of the same order as $A$. In this case, since $A$ is of order 2, $I = I_2 = \begin{bmatrix}1&0\\0&1\end{bmatrix}$.
Taking the determinant of both sides of the equation $A A^{-1} = I$:
$\det(A A^{-1}) = \det(I)$
Using the property of determinants which states that the determinant of a product of matrices is the product of their determinants, i.e., $\det(AB) = \det(A) \det(B)$, we can write the left side as:
$\det(A) \det(A^{-1}) = \det(I)$
We also know that the determinant of an identity matrix of any order is 1, i.e., $\det(I) = 1$.
Substituting this value into the equation:
$\det(A) \det(A^{-1}) = 1$
Since $A$ is an invertible matrix, $\det(A) \neq 0$. Therefore, we can divide both sides of the equation by $\det(A)$ to solve for $\det(A^{-1})$:
$\det(A^{-1}) = \frac{1}{\det(A)}$
Conclusion:
For an invertible matrix $A$ of order 2 (or any order $n$), the determinant of its inverse $A^{-1}$ is equal to the reciprocal of the determinant of $A$.
$\det(A^{-1}) = \frac{1}{\det(A)}$
This corresponds to option (B).
The final answer is (B) $\frac{1}{\det(A)}$.
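A minimal numeric sketch of the same identity, assuming Python with NumPy, using an arbitrary invertible $2 \times 2$ matrix:

```python
import numpy as np

# Quick check of det(A^{-1}) = 1 / det(A) for an invertible 2x2 matrix.
A = np.array([[2, 5],
              [3, 2]], dtype=float)       # det = 4 - 15 = -11, so A is invertible

print(np.linalg.det(np.linalg.inv(A)))    # approximately -0.0909... = -1/11
print(1 / np.linalg.det(A))               # the same value
```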
Example 27 to 29 (Before Exercise 4.6)
Example 27: Solve the system of equations
2x + 5y = 1
3x + 2y = 7
Answer:
Given:
The system of linear equations:
$2x + 5y = 1$
$3x + 2y = 7$
Solution:
We can solve the given system of equations using the matrix method. The system of equations can be written in the matrix form $AX = B$, where:
$A = \begin{bmatrix}2&5\\3&2\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}1\\7\end{bmatrix}$
The solution to the matrix equation $AX = B$ is given by $X = A^{-1}B$, provided that the matrix $A$ is invertible (i.e., its determinant is non-zero).
First, we calculate the determinant of matrix $A$:
$\det(A) = (2)(2) - (5)(3)$
$\det(A) = 4 - 15 = -11$
Since $\det(A) = -11 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists.
Next, we find the inverse of matrix $A$. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse is $\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$
$A^{-1} = \frac{1}{-11} \begin{bmatrix}2&-5\\-3&2\end{bmatrix}$
Now, we calculate $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{-11} \begin{bmatrix}2&-5\\-3&2\end{bmatrix} \begin{bmatrix}1\\7\end{bmatrix}$
Perform the matrix multiplication:
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}(2)(1) + (-5)(7)\\(-3)(1) + (2)(7)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}2 - 35\\-3 + 14\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}-33\\11\end{bmatrix}$
Multiply the scalar $-\frac{1}{11}$ by each element in the matrix:
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}(-\frac{1}{11})(-33)\\(-\frac{1}{11})(11)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}\frac{33}{11}\\-\frac{11}{11}\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}3\\-1\end{bmatrix}$
Equating the corresponding elements, we get the values of $x$ and $y$.
Conclusion:
The solution to the given system of equations is $x = 3$ and $y = -1$.
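The same computation can be reproduced programmatically. Here is a minimal sketch, assuming Python with NumPy; it applies the $2 \times 2$ inverse formula and cross-checks the answer against the library solver.

```python
import numpy as np

# Matrix-method solution of 2x + 5y = 1, 3x + 2y = 7, following X = A^{-1} B.
A = np.array([[2, 5],
              [3, 2]], dtype=float)
B = np.array([1, 7], dtype=float)

det_A = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]    # ad - bc = -11
adj_A = np.array([[A[1, 1], -A[0, 1]],
                  [-A[1, 0], A[0, 0]]])          # [[d, -b], [-c, a]]
X = (adj_A @ B) / det_A

print(X)                        # [ 3. -1.]  ->  x = 3, y = -1
print(np.linalg.solve(A, B))    # same answer from the library solver
```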
Example 28: Solve the following system of equations by matrix method.
3x – 2y + 3z = 8
2x + y – z = 1
4x – 3y + 2z = 4
Answer:
Given:
The system of linear equations:
$3x - 2y + 3z = 8$
$2x + y - z = 1$
$4x - 3y + 2z = 4$
Solution:
We will solve the given system of equations using the matrix method. The system can be written in the matrix form $AX = B$, where:
$A = \begin{bmatrix}3&-2&3\\2&1&-1\\4&-3&2\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}8\\1\\4\end{bmatrix}$
The solution is given by $X = A^{-1}B$, provided that the matrix $A$ is invertible.
First, we calculate the determinant of $A$ to check if it is invertible:
$\det(A) = 3 \begin{vmatrix}1&-1\\-3&2\end{vmatrix} - (-2) \begin{vmatrix}2&-1\\4&2\end{vmatrix} + 3 \begin{vmatrix}2&1\\4&-3\end{vmatrix}$
$\det(A) = 3((1)(2) - (-1)(-3)) + 2((2)(2) - (-1)(4)) + 3((2)(-3) - (1)(4))$
$\det(A) = 3(2 - 3) + 2(4 + 4) + 3(-6 - 4)$
$\det(A) = 3(-1) + 2(8) + 3(-10)$
$\det(A) = -3 + 16 - 30$
$\det(A) = 13 - 30 = -17$
Since $\det(A) = -17 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. Thus, the system has a unique solution.
Next, we find the adjoint of $A$, denoted as $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
Calculate the cofactors $C_{ij}$ of each element $a_{ij}$ in $A$:
$C_{11} = + \begin{vmatrix}1&-1\\-3&2\end{vmatrix} = (1)(2) - (-1)(-3) = 2 - 3 = -1$
$C_{12} = - \begin{vmatrix}2&-1\\4&2\end{vmatrix} = -((2)(2) - (-1)(4)) = -(4 + 4) = -8$
$C_{13} = + \begin{vmatrix}2&1\\4&-3\end{vmatrix} = (2)(-3) - (1)(4) = -6 - 4 = -10$
$C_{21} = - \begin{vmatrix}-2&3\\-3&2\end{vmatrix} = -((-2)(2) - (3)(-3)) = -(-4 + 9) = -5$
$C_{22} = + \begin{vmatrix}3&3\\4&2\end{vmatrix} = (3)(2) - (3)(4) = 6 - 12 = -6$
$C_{23} = - \begin{vmatrix}3&-2\\4&-3\end{vmatrix} = -((3)(-3) - (-2)(4)) = -(-9 + 8) = 1$
$C_{31} = + \begin{vmatrix}-2&3\\1&-1\end{vmatrix} = (-2)(-1) - (3)(1) = 2 - 3 = -1$
$C_{32} = - \begin{vmatrix}3&3\\2&-1\end{vmatrix} = -((3)(-1) - (3)(2)) = -(-3 - 6) = 9$
$C_{33} = + \begin{vmatrix}3&-2\\2&1\end{vmatrix} = (3)(1) - (-2)(2) = 3 + 4 = 7$
The matrix of cofactors is $\begin{bmatrix}-1&-8&-10\\-5&-6&1\\-1&9&7\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}-1&-5&-1\\-8&-6&9\\-10&1&7\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{-17} \begin{bmatrix}-1&-5&-1\\-8&-6&9\\-10&1&7\end{bmatrix} = -\frac{1}{17} \begin{bmatrix}-1&-5&-1\\-8&-6&9\\-10&1&7\end{bmatrix} = \frac{1}{17} \begin{bmatrix}1&5&1\\8&6&-9\\10&-1&-7\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{17} \begin{bmatrix}1&5&1\\8&6&-9\\10&-1&-7\end{bmatrix} \begin{bmatrix}8\\1\\4\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{17} \begin{bmatrix} (1)(8) + (5)(1) + (1)(4) \\ (8)(8) + (6)(1) + (-9)(4) \\ (10)(8) + (-1)(1) + (-7)(4) \end{bmatrix}$
$X = \frac{1}{17} \begin{bmatrix} 8 + 5 + 4 \\ 64 + 6 - 36 \\ 80 - 1 - 28 \end{bmatrix}$
$X = \frac{1}{17} \begin{bmatrix} 17 \\ 70 - 36 \\ 79 - 28 \end{bmatrix}$
$X = \frac{1}{17} \begin{bmatrix} 17 \\ 34 \\ 51 \end{bmatrix}$
Multiply by the scalar $\frac{1}{17}$:
$X = \begin{bmatrix} \frac{17}{17} \\ \frac{34}{17} \\ \frac{51}{17} \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$
Equating the elements of the matrix $X$, we get:
$x = 1$
$y = 2$
$z = 3$
Conclusion:
The solution to the given system of equations is $x = 1$, $y = 2$, and $z = 3$.
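A minimal cross-check of this solution, assuming Python with NumPy, recovers the adjoint as $|A|\,A^{-1}$ and re-derives $X = A^{-1}B$.

```python
import numpy as np

# Cross-check of Example 28: X = A^{-1} B with A^{-1} = adj(A) / det(A).
A = np.array([[3, -2, 3],
              [2, 1, -1],
              [4, -3, 2]], dtype=float)
B = np.array([8, 1, 4], dtype=float)

det_A = np.linalg.det(A)                 # -17
adj_A = det_A * np.linalg.inv(A)         # adjoint of A, for nonsingular A
X = adj_A @ B / det_A

print(np.round(adj_A))                   # matches the adjoint found above (as floats)
print(np.round(X, 6))                    # [1. 2. 3.]
print(A @ X)                             # [8. 1. 4.], i.e. AX = B holds
```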
Example 29: The sum of three numbers is 6. If we multiply the third number by 3 and add the second number to it, we get 11. By adding the first and third numbers, we get double the second number. Represent this algebraically and find the numbers using the matrix method.
Answer:
Given:
The following conditions regarding three numbers:
1. The sum of the three numbers is 6.
2. The sum of the second number and three times the third number is 11.
3. The sum of the first and third numbers is double the second number.
To Find:
The three numbers using the matrix method.
Solution:
Let the three numbers be $x$, $y$, and $z$. We translate the given conditions into a system of linear equations:
From the first condition:
$x + y + z = 6$
From the second condition:
$y + 3z = 11$
From the third condition:
$x + z = 2y$
Rearranging the equations into standard form ($ax + by + cz = d$):
$x + y + z = 6$
$0x + y + 3z = 11$
$x - 2y + z = 0$
This system of equations can be written in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&1&1\\0&1&3\\1&-2&1\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}6\\11\\0\end{bmatrix}$
The solution is given by $X = A^{-1}B$, provided that the matrix $A$ is invertible.
First, we calculate the determinant of $A$:
$\det(A) = 1 \begin{vmatrix}1&3\\-2&1\end{vmatrix} - 1 \begin{vmatrix}0&3\\1&1\end{vmatrix} + 1 \begin{vmatrix}0&1\\1&-2\end{vmatrix}$
$\det(A) = 1((1)(1) - (3)(-2)) - 1((0)(1) - (3)(1)) + 1((0)(-2) - (1)(1))$
$\det(A) = 1(1 + 6) - 1(0 - 3) + 1(0 - 1)$
$\det(A) = 7 + 3 - 1 = 9$
Since $\det(A) = 9 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. Thus, the system has a unique solution.
Next, we find the adjoint of $A$, $\text{adj}(A)$, which is the transpose of the matrix of cofactors.
The cofactors $C_{ij}$ are:
$C_{11} = \begin{vmatrix}1&3\\-2&1\end{vmatrix} = 7$
$C_{12} = -\begin{vmatrix}0&3\\1&1\end{vmatrix} = 3$
$C_{13} = \begin{vmatrix}0&1\\1&-2\end{vmatrix} = -1$
$C_{21} = -\begin{vmatrix}1&1\\-2&1\end{vmatrix} = -3$
$C_{22} = \begin{vmatrix}1&1\\1&1\end{vmatrix} = 0$
$C_{23} = -\begin{vmatrix}1&1\\1&-2\end{vmatrix} = 3$
$C_{31} = \begin{vmatrix}1&1\\1&3\end{vmatrix} = 2$
$C_{32} = -\begin{vmatrix}1&1\\0&3\end{vmatrix} = -3$
$C_{33} = \begin{vmatrix}1&1\\0&1\end{vmatrix} = 1$
The matrix of cofactors is $\begin{bmatrix}7&3&-1\\-3&0&3\\2&-3&1\end{bmatrix}$.
The adjoint of $A$ is the transpose of this matrix:
$\text{adj}(A) = \begin{bmatrix}7&-3&2\\3&0&-3\\-1&3&1\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{9} \begin{bmatrix}7&-3&2\\3&0&-3\\-1&3&1\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{9} \begin{bmatrix}7&-3&2\\3&0&-3\\-1&3&1\end{bmatrix} \begin{bmatrix}6\\11\\0\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{9} \begin{bmatrix} (7)(6) + (-3)(11) + (2)(0) \\ (3)(6) + (0)(11) + (-3)(0) \\ (-1)(6) + (3)(11) + (1)(0) \end{bmatrix}$
$X = \frac{1}{9} \begin{bmatrix} 42 - 33 + 0 \\ 18 + 0 + 0 \\ -6 + 33 + 0 \end{bmatrix}$
$X = \frac{1}{9} \begin{bmatrix} 9 \\ 18 \\ 27 \end{bmatrix}$
Multiply by the scalar $\frac{1}{9}$:
$X = \begin{bmatrix} \frac{9}{9} \\ \frac{18}{9} \\ \frac{27}{9} \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$
Equating the elements, we find $x=1$, $y=2$, and $z=3$.
Conclusion:
The three numbers are 1, 2, and 3.
Exercise 4.6
Examine the consistency of the system of equations in Exercises 1 to 6
Question 1.
x + 2y = 2
2x + 3y = 3
Answer:
Given:
The system of linear equations:
$x + 2y = 2$
$2x + 3y = 3$
To Examine:
The consistency of the given system of equations.
Solution:
The given system of equations can be written in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&2\\2&3\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}2\\3\end{bmatrix}$
To determine the consistency of the system, we first calculate the determinant of the coefficient matrix $A$.
$\det(A) = (1)(3) - (2)(2)$
$\det(A) = 3 - 4$
$\det(A) = -1$
Since $\det(A) = -1 \neq 0$, the matrix $A$ is a nonsingular matrix.
For a system of linear equations $AX = B$, if $\det(A) \neq 0$, then the system is consistent and has a unique solution given by $X = A^{-1}B$.
Conclusion:
Since the determinant of the coefficient matrix is non-zero, the given system of equations is consistent.
Question 2.
2x – y = 5
x + y = 4
Answer:
Given:
The system of linear equations:
$2x - y = 5$
$x + y = 4$
To Examine:
The consistency of the given system of equations.
Solution:
The given system of equations can be written in the matrix form $AX = B$, where:
$A = \begin{bmatrix}2&-1\\1&1\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}5\\4\end{bmatrix}$
To determine the consistency of the system, we calculate the determinant of the coefficient matrix $A$.
$\det(A) = (2)(1) - (-1)(1)$
$\det(A) = 2 - (-1)$
$\det(A) = 2 + 1 = 3$
Since $\det(A) = 3 \neq 0$, the matrix $A$ is a nonsingular matrix.
For a system of linear equations $AX = B$, if $\det(A) \neq 0$, then the matrix $A$ is invertible ($A^{-1}$ exists), and the system is consistent and has a unique solution given by $X = A^{-1}B$.
Conclusion:
Since the determinant of the coefficient matrix is non-zero, the given system of equations is consistent.
Question 3.
x + 3y = 5
2x + 6y = 8
Answer:
Given:
The system of linear equations:
$x + 3y = 5$
$2x + 6y = 8$
To Examine:
The consistency of the given system of equations.
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&3\\2&6\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}5\\8\end{bmatrix}$
To determine the consistency of the system, we first calculate the determinant of the coefficient matrix $A$.
$\det(A) = (1)(6) - (3)(2)$
$\det(A) = 6 - 6 = 0$
Since $\det(A) = 0$, the matrix $A$ is a singular matrix. In this case, we need to calculate the product of the adjoint of $A$ and the matrix $B$, i.e., $(\text{adj} A)B$, to determine the consistency.
First, we find the adjoint of the $2 \times 2$ matrix $A = \begin{bmatrix}a&b\\c&d\end{bmatrix}$, which is $\text{adj}(A) = \begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
So, $\text{adj}(A) = \begin{bmatrix}6&-3\\-2&1\end{bmatrix}$.
Now, we calculate $(\text{adj} A)B$:
$(\text{adj} A)B = \begin{bmatrix}6&-3\\-2&1\end{bmatrix} \begin{bmatrix}5\\8\end{bmatrix}$
Perform the matrix multiplication:
$(\text{adj} A)B = \begin{bmatrix}(6)(5) + (-3)(8)\\(-2)(5) + (1)(8)\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix}30 - 24\\-10 + 8\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix}6\\-2\end{bmatrix}$
The zero matrix of the same order is $O = \begin{bmatrix}0\\0\end{bmatrix}$.
We observe that $(\text{adj} A)B = \begin{bmatrix}6\\-2\end{bmatrix} \neq \begin{bmatrix}0\\0\end{bmatrix} = O$.
For a system of linear equations $AX = B$, if $\det(A) = 0$, then:
If $(\text{adj} A)B \neq O$, the system is inconsistent (has no solution).
If $(\text{adj} A)B = O$, the system is consistent (has infinitely many solutions).
In this case, since $\det(A) = 0$ and $(\text{adj} A)B \neq O$, the system is inconsistent.
Conclusion:
Since the determinant of the coefficient matrix is zero and $(\text{adj} A)B$ is not the zero matrix, the given system of equations is inconsistent.
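The decision procedure used here can be summarised in a small helper. The sketch below assumes Python with NumPy; the `classify` function is illustrative (not a library routine) and handles only the $2 \times 2$ adjoint.

```python
import numpy as np

# Consistency test for a 2x2 system AX = B, mirroring the rules used above:
#   det(A) != 0                  -> consistent (unique solution)
#   det(A) == 0, (adj A)B != O   -> inconsistent
#   det(A) == 0, (adj A)B == O   -> consistent (infinitely many solutions)
def classify(A, B, tol=1e-9):
    det_A = np.linalg.det(A)
    if abs(det_A) > tol:
        return "consistent (unique solution)"
    # adjoint of a 2x2 matrix [[a, b], [c, d]] is [[d, -b], [-c, a]]
    adj_A = np.array([[A[1, 1], -A[0, 1]],
                      [-A[1, 0], A[0, 0]]])
    if np.allclose(adj_A @ B, 0, atol=tol):
        return "consistent (infinitely many solutions)"
    return "inconsistent"

A = np.array([[1, 3], [2, 6]], dtype=float)
B = np.array([5, 8], dtype=float)
print(classify(A, B))   # inconsistent, since (adj A)B = [6, -2] != O
```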
Question 4.
x + y + z = 1
2x + 3y + 2z = 2
ax + ay + 2az = 4
Answer:
Given:
The system of linear equations:
$x + y + z = 1$
$2x + 3y + 2z = 2$
$ax + ay + 2az = 4$
To Examine:
The consistency of the given system of equations.
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&1&1\\2&3&2\\a&a&2a\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}1\\2\\4\end{bmatrix}$
To determine the consistency of the system, we first calculate the determinant of the coefficient matrix $A$.
$\det(A) = 1 \begin{vmatrix}3&2\\a&2a\end{vmatrix} - 1 \begin{vmatrix}2&2\\a&2a\end{vmatrix} + 1 \begin{vmatrix}2&3\\a&a\end{vmatrix}$
$\det(A) = 1((3)(2a) - (2)(a)) - 1((2)(2a) - (2)(a)) + 1((2)(a) - (3)(a))$
$\det(A) = 1(6a - 2a) - 1(4a - 2a) + 1(2a - 3a)$
$\det(A) = 4a - 2a - a = a$
The consistency of the system depends on the value of $\det(A)$.
Case 1: $\det(A) \neq 0$
If $\det(A) \neq 0$, which means $a \neq 0$, then the matrix $A$ is nonsingular. In this case, the inverse $A^{-1}$ exists, and the system $AX = B$ has a unique solution given by $X = A^{-1}B$.
Thus, if $a \neq 0$, the system is consistent.
Case 2: $\det(A) = 0$
If $\det(A) = 0$, which means $a = 0$, then the matrix $A$ is singular. In this case, we need to examine the product $(\text{adj} A)B$.
If $a = 0$, the matrix $A$ becomes $A = \begin{bmatrix}1&1&1\\2&3&2\\0&0&0\end{bmatrix}$.
First, find the adjoint of $A$ when $a=0$. The cofactors are:
$C_{11} = \begin{vmatrix}3&2\\0&0\end{vmatrix} = 0$
$C_{12} = -\begin{vmatrix}2&2\\0&0\end{vmatrix} = 0$
$C_{13} = \begin{vmatrix}2&3\\0&0\end{vmatrix} = 0$
$C_{21} = -\begin{vmatrix}1&1\\0&0\end{vmatrix} = 0$
$C_{22} = \begin{vmatrix}1&1\\0&0\end{vmatrix} = 0$
$C_{23} = -\begin{vmatrix}1&1\\0&0\end{vmatrix} = 0$
$C_{31} = \begin{vmatrix}1&1\\3&2\end{vmatrix} = 1(2)-1(3) = -1$
$C_{32} = -\begin{vmatrix}1&1\\2&2\end{vmatrix} = -(1(2)-1(2)) = 0$
$C_{33} = \begin{vmatrix}1&1\\2&3\end{vmatrix} = 1(3)-1(2) = 1$
The matrix of cofactors is $\begin{bmatrix}0&0&0\\0&0&0\\-1&0&1\end{bmatrix}$.
The adjoint of $A$ is the transpose of this matrix: $\text{adj}(A) = \begin{bmatrix}0&0&-1\\0&0&0\\0&0&1\end{bmatrix}$.
Now calculate $(\text{adj} A)B$:
$(\text{adj} A)B = \begin{bmatrix}0&0&-1\\0&0&0\\0&0&1\end{bmatrix} \begin{bmatrix}1\\2\\4\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix} (0)(1) + (0)(2) + (-1)(4) \\ (0)(1) + (0)(2) + (0)(4) \\ (0)(1) + (0)(2) + (1)(4) \end{bmatrix} = \begin{bmatrix} 0 + 0 - 4 \\ 0 + 0 + 0 \\ 0 + 0 + 4 \end{bmatrix} = \begin{bmatrix} -4 \\ 0 \\ 4 \end{bmatrix}$
Since $(\text{adj} A)B = \begin{bmatrix}-4\\0\\4\end{bmatrix} \neq \begin{bmatrix}0\\0\\0\end{bmatrix} = O$ when $a=0$, the system is inconsistent (has no solution) when $a=0$.
Conclusion:
The system of equations is consistent if $a \neq 0$ and inconsistent if $a = 0$.
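A symbolic cross-check of this case analysis, assuming Python with SymPy, confirms that $\det(A) = a$ and that $(\text{adj } A)B \neq O$ when $a = 0$.

```python
import sympy as sp

# Symbolic check for the coefficient matrix of Question 4.
a = sp.symbols('a')
A = sp.Matrix([[1, 1, 1],
               [2, 3, 2],
               [a, a, 2*a]])
B = sp.Matrix([1, 2, 4])

print(sp.simplify(A.det()))         # a
print(A.subs(a, 0).adjugate() * B)  # Matrix([[-4], [0], [4]]) != O at a = 0
```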
Question 5.
3x – y – 2z = 2
2y – z = –1
3x – 5y = 3
Answer:
Given:
The system of linear equations:
$3x - y - 2z = 2$
$2y - z = -1$
$3x - 5y = 3$
We can rewrite the equations to explicitly show all variables:
$3x - 1y - 2z = 2$
$0x + 2y - 1z = -1$
$3x - 5y + 0z = 3$
To Examine:
The consistency of the given system of equations.
Solution:
We write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}3&-1&-2\\0&2&-1\\3&-5&0\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}2\\-1\\3\end{bmatrix}$
To determine the consistency of the system, we first calculate the determinant of the coefficient matrix $A$.
$\det(A) = 3 \begin{vmatrix}2&-1\\-5&0\end{vmatrix} - (-1) \begin{vmatrix}0&-1\\3&0\end{vmatrix} + (-2) \begin{vmatrix}0&2\\3&-5\end{vmatrix}$
$\det(A) = 3((2)(0) - (-1)(-5)) + 1((0)(0) - (-1)(3)) - 2((0)(-5) - (2)(3))$
$\det(A) = 3(0 - 5) + 1(0 + 3) - 2(0 - 6)$
$\det(A) = 3(-5) + 1(3) - 2(-6)$
$\det(A) = -15 + 3 + 12$
$\det(A) = -15 + 15 = 0$
Since $\det(A) = 0$, the matrix $A$ is a singular matrix. The system is either inconsistent or consistent with infinitely many solutions. To differentiate, we need to calculate $(\text{adj} A)B$.
First, we find the adjoint of $A$, $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
The cofactors $C_{ij}$ are:
$C_{11} = +\begin{vmatrix}2&-1\\-5&0\end{vmatrix} = (2)(0) - (-1)(-5) = 0 - 5 = -5$
$C_{12} = -\begin{vmatrix}0&-1\\3&0\end{vmatrix} = -((0)(0) - (-1)(3)) = -(0 + 3) = -3$
$C_{13} = +\begin{vmatrix}0&2\\3&-5\end{vmatrix} = (0)(-5) - (2)(3) = 0 - 6 = -6$
$C_{21} = -\begin{vmatrix}-1&-2\\-5&0\end{vmatrix} = -((-1)(0) - (-2)(-5)) = -(0 - 10) = 10$
$C_{22} = +\begin{vmatrix}3&-2\\3&0\end{vmatrix} = (3)(0) - (-2)(3) = 0 - (-6) = 6$
$C_{23} = -\begin{vmatrix}3&-1\\3&-5\end{vmatrix} = -((3)(-5) - (-1)(3)) = -(-15 + 3) = 12$
$C_{31} = +\begin{vmatrix}-1&-2\\2&-1\end{vmatrix} = (-1)(-1) - (-2)(2) = 1 - (-4) = 5$
$C_{32} = -\begin{vmatrix}3&-2\\0&-1\end{vmatrix} = -((3)(-1) - (-2)(0)) = -(-3 - 0) = 3$
$C_{33} = +\begin{vmatrix}3&-1\\0&2\end{vmatrix} = (3)(2) - (-1)(0) = 6 - 0 = 6$
The matrix of cofactors is $\begin{bmatrix}-5&-3&-6\\10&6&12\\5&3&6\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}-5&10&5\\-3&6&3\\-6&12&6\end{bmatrix}$
Now, we calculate $(\text{adj} A)B$:
$(\text{adj} A)B = \begin{bmatrix}-5&10&5\\-3&6&3\\-6&12&6\end{bmatrix} \begin{bmatrix}2\\-1\\3\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix}(-5)(2) + (10)(-1) + (5)(3)\\(-3)(2) + (6)(-1) + (3)(3)\\(-6)(2) + (12)(-1) + (6)(3)\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix}-10 - 10 + 15\\-6 - 6 + 9\\-12 - 12 + 18\end{bmatrix}$
$(\text{adj} A)B = \begin{bmatrix}-20 + 15\\-12 + 9\\-24 + 18\end{bmatrix} = \begin{bmatrix}-5\\-3\\-6\end{bmatrix}$
The zero matrix of order $3 \times 1$ is $O = \begin{bmatrix}0\\0\\0\end{bmatrix}$.
We see that $(\text{adj} A)B = \begin{bmatrix}-5\\-3\\-6\end{bmatrix} \neq \begin{bmatrix}0\\0\\0\end{bmatrix} = O$.
For a system of linear equations $AX = B$ with $\det(A) = 0$:
If $(\text{adj} A)B \neq O$, the system is inconsistent (has no solution).
If $(\text{adj} A)B = O$, the system is consistent (has infinitely many solutions).
In this case, since $\det(A) = 0$ and $(\text{adj} A)B \neq O$, the system is inconsistent.
Conclusion:
Since the determinant of the coefficient matrix is zero and $(\text{adj} A)B$ is not the zero matrix, the given system of equations is inconsistent.
Question 6.
5x – y + 4z = 5
2x + 3y + 5z = 2
5x – 2y + 6z = –1
Answer:
Given:
The system of linear equations:
$5x - y + 4z = 5$
$2x + 3y + 5z = 2$
$5x - 2y + 6z = -1$
To Examine:
The consistency of the given system of equations.
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}5&-1&4\\2&3&5\\5&-2&6\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}5\\2\\-1\end{bmatrix}$
To determine the consistency of the system, we first calculate the determinant of the coefficient matrix $A$.
$\det(A) = 5 \begin{vmatrix}3&5\\-2&6\end{vmatrix} - (-1) \begin{vmatrix}2&5\\5&6\end{vmatrix} + 4 \begin{vmatrix}2&3\\5&-2\end{vmatrix}$
$\det(A) = 5((3)(6) - (5)(-2)) + 1((2)(6) - (5)(5)) + 4((2)(-2) - (3)(5))$
$\det(A) = 5(18 + 10) + 1(12 - 25) + 4(-4 - 15)$
$\det(A) = 5(28) + 1(-13) + 4(-19)$
$\det(A) = 140 - 13 - 76$
$\det(A) = 140 - 89$
$\det(A) = 51$
Since $\det(A) = 51 \neq 0$, the matrix $A$ is a nonsingular matrix.
For a system of linear equations $AX = B$, if $\det(A) \neq 0$, then the matrix $A$ is invertible ($A^{-1}$ exists), and the system is consistent and has a unique solution given by $X = A^{-1}B$.
Conclusion:
Since the determinant of the coefficient matrix is non-zero, the given system of equations is consistent.
Solve system of linear equations, using matrix method, in Exercises 7 to 14.
Question 7.
5x + 2y = 4
7x + 3y = 5
Answer:
Given:
The system of linear equations:
$5x + 2y = 4$
$7x + 3y = 5$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}5&2\\7&3\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}4\\5\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = (5)(3) - (2)(7)$
$\det(A) = 15 - 14 = 1$
Since $\det(A) = 1 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Now, we find the inverse of matrix $A$. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse is $\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$
$A^{-1} = \frac{1}{1} \begin{bmatrix}3&-2\\-7&5\end{bmatrix} = \begin{bmatrix}3&-2\\-7&5\end{bmatrix}$
Now, we calculate $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}3&-2\\-7&5\end{bmatrix} \begin{bmatrix}4\\5\end{bmatrix}$
Perform the matrix multiplication:
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}(3)(4) + (-2)(5)\\(-7)(4) + (5)(5)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}12 - 10\\-28 + 25\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}2\\-3\end{bmatrix}$
Equating the corresponding elements, we get the values of $x$ and $y$.
Conclusion:
The solution to the given system of equations is $x = 2$ and $y = -3$.
Question 8.
2x – y = –2
3x + 4y = 3
Answer:
Given:
The system of linear equations:
$2x - y = -2$
$3x + 4y = 3$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}2&-1\\3&4\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}-2\\3\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = (2)(4) - (-1)(3)$
$\det(A) = 8 - (-3)$
$\det(A) = 8 + 3 = 11$
Since $\det(A) = 11 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Now, we find the inverse of matrix $A$. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse is $\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$
$A^{-1} = \frac{1}{11} \begin{bmatrix}4&1\\-3&2\end{bmatrix}$
Now, we calculate $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{11} \begin{bmatrix}4&1\\-3&2\end{bmatrix} \begin{bmatrix}-2\\3\end{bmatrix}$
Perform the matrix multiplication:
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{11} \begin{bmatrix}(4)(-2) + (1)(3)\\(-3)(-2) + (2)(3)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{11} \begin{bmatrix}-8 + 3\\6 + 6\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{11} \begin{bmatrix}-5\\12\end{bmatrix}$
Multiply the scalar $\frac{1}{11}$ by each element in the matrix:
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}\frac{-5}{11}\\\frac{12}{11}\end{bmatrix}$
Equating the corresponding elements, we get the values of $x$ and $y$.
Conclusion:
The solution to the given system of equations is $x = -\frac{5}{11}$ and $y = \frac{12}{11}$.
Question 9.
4x – 3y = 3
3x – 5y = 7
Answer:
Given:
The system of linear equations:
$4x - 3y = 3$
$3x - 5y = 7$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}4&-3\\3&-5\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}3\\7\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = (4)(-5) - (-3)(3)$
$\det(A) = -20 - (-9)$
$\det(A) = -20 + 9 = -11$
Since $\det(A) = -11 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Now, we find the inverse of matrix $A$. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse is $\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$
$A^{-1} = \frac{1}{-11} \begin{bmatrix}-5&3\\-3&4\end{bmatrix}$
Now, we calculate $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{-11} \begin{bmatrix}-5&3\\-3&4\end{bmatrix} \begin{bmatrix}3\\7\end{bmatrix}$
Perform the matrix multiplication:
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}(-5)(3) + (3)(7)\\(-3)(3) + (4)(7)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}-15 + 21\\-9 + 28\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = -\frac{1}{11} \begin{bmatrix}6\\19\end{bmatrix}$
Multiply the scalar $-\frac{1}{11}$ by each element in the matrix:
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}(-\frac{1}{11})(6)\\(-\frac{1}{11})(19)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}-\frac{6}{11}\\-\frac{19}{11}\end{bmatrix}$
Equating the corresponding elements, we get the values of $x$ and $y$.
Conclusion:
The solution to the given system of equations is $x = -\frac{6}{11}$ and $y = -\frac{19}{11}$.
Question 10.
5x + 2y = 3
3x + 2y = 5
Answer:
Given:
The system of linear equations:
$5x + 2y = 3$
$3x + 2y = 5$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}5&2\\3&2\end{bmatrix}$, $X = \begin{bmatrix}x\\y\end{bmatrix}$, and $B = \begin{bmatrix}3\\5\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = (5)(2) - (2)(3)$
$\det(A) = 10 - 6 = 4$
Since $\det(A) = 4 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Now, we find the inverse of matrix $A$. For a $2 \times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$, the inverse is $\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$.
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$
$A^{-1} = \frac{1}{4} \begin{bmatrix}2&-2\\-3&5\end{bmatrix}$
Now, we calculate $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{4} \begin{bmatrix}2&-2\\-3&5\end{bmatrix} \begin{bmatrix}3\\5\end{bmatrix}$
Perform the matrix multiplication:
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{4} \begin{bmatrix}(2)(3) + (-2)(5)\\(-3)(3) + (5)(5)\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{4} \begin{bmatrix}6 - 10\\-9 + 25\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \frac{1}{4} \begin{bmatrix}-4\\16\end{bmatrix}$
Multiply the scalar $\frac{1}{4}$ by each element in the matrix:
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}\frac{-4}{4}\\\frac{16}{4}\end{bmatrix}$
$\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}-1\\4\end{bmatrix}$
Equating the corresponding elements, we get the values of $x$ and $y$.
Conclusion:
The solution to the given system of equations is $x = -1$ and $y = 4$.
Question 11.
2x + y + z = 1
x – 2y – z = $\frac{3}{2}$
3y – 5z = 9
Answer:
Given:
The system of linear equations:
$2x + y + z = 1$
$x - 2y - z = \frac{3}{2}$
$3y - 5z = 9$
We can write the equations with explicit coefficients for all variables:
$2x + 1y + 1z = 1$
$1x - 2y - 1z = \frac{3}{2}$
$0x + 3y - 5z = 9$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}2&1&1\\1&-2&-1\\0&3&-5\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}1\\\frac{3}{2}\\9\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = 2 \begin{vmatrix}-2&-1\\3&-5\end{vmatrix} - 1 \begin{vmatrix}1&-1\\0&-5\end{vmatrix} + 1 \begin{vmatrix}1&-2\\0&3\end{vmatrix}$
$\det(A) = 2((-2)(-5) - (-1)(3)) - 1((1)(-5) - (-1)(0)) + 1((1)(3) - (-2)(0))$
$\det(A) = 2(10 + 3) - 1(-5 - 0) + 1(3 - 0)$
$\det(A) = 2(13) - 1(-5) + 1(3)$
$\det(A) = 26 + 5 + 3 = 34$
Since $\det(A) = 34 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Next, we find the adjoint of $A$, $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
The cofactors $C_{ij}$ are:
$C_{11} = +\begin{vmatrix}-2&-1\\3&-5\end{vmatrix} = (-2)(-5) - (-1)(3) = 10 + 3 = 13$
$C_{12} = -\begin{vmatrix}1&-1\\0&-5\end{vmatrix} = -((1)(-5) - (-1)(0)) = -(-5 - 0) = 5$
$C_{13} = +\begin{vmatrix}1&-2\\0&3\end{vmatrix} = (1)(3) - (-2)(0) = 3 - 0 = 3$
$C_{21} = -\begin{vmatrix}1&1\\3&-5\end{vmatrix} = -((1)(-5) - (1)(3)) = -(-5 - 3) = 8$
$C_{22} = +\begin{vmatrix}2&1\\0&-5\end{vmatrix} = (2)(-5) - (1)(0) = -10 - 0 = -10$
$C_{23} = -\begin{vmatrix}2&1\\0&3\end{vmatrix} = -((2)(3) - (1)(0)) = -(6 - 0) = -6$
$C_{31} = +\begin{vmatrix}1&1\\-2&-1\end{vmatrix} = (1)(-1) - (1)(-2) = -1 + 2 = 1$
$C_{32} = -\begin{vmatrix}2&1\\1&-1\end{vmatrix} = -((2)(-1) - (1)(1)) = -(-2 - 1) = 3$
$C_{33} = +\begin{vmatrix}2&1\\1&-2\end{vmatrix} = (2)(-2) - (1)(1) = -4 - 1 = -5$
The matrix of cofactors is $\begin{bmatrix}13&5&3\\8&-10&-6\\1&3&-5\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}13&8&1\\5&-10&3\\3&-6&-5\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{34} \begin{bmatrix}13&8&1\\5&-10&3\\3&-6&-5\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{34} \begin{bmatrix}13&8&1\\5&-10&3\\3&-6&-5\end{bmatrix} \begin{bmatrix}1\\\frac{3}{2}\\9\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{34} \begin{bmatrix} (13)(1) + (8)(\frac{3}{2}) + (1)(9) \\ (5)(1) + (-10)(\frac{3}{2}) + (3)(9) \\ (3)(1) + (-6)(\frac{3}{2}) + (-5)(9) \end{bmatrix}$
$X = \frac{1}{34} \begin{bmatrix} 13 + 12 + 9 \\ 5 - 15 + 27 \\ 3 - 9 - 45 \end{bmatrix}$
$X = \frac{1}{34} \begin{bmatrix} 34 \\ 17 \\ -51 \end{bmatrix}$
Multiply by the scalar $\frac{1}{34}$:
$X = \begin{bmatrix} \frac{34}{34} \\ \frac{17}{34} \\ \frac{-51}{34} \end{bmatrix} = \begin{bmatrix} 1 \\ \frac{1}{2} \\ -\frac{3}{2} \end{bmatrix}$
Equating the elements, we find $x=1$, $y=\frac{1}{2}$, and $z=-\frac{3}{2}$.
Conclusion:
The solution to the given system of equations is $x = 1$, $y = \frac{1}{2}$, and $z = -\frac{3}{2}$.
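Because the right-hand side contains the fraction $\frac{3}{2}$, an exact-arithmetic cross-check avoids floating-point noise. This minimal sketch assumes Python's standard `fractions` module and reuses the adjoint and determinant found above.

```python
from fractions import Fraction

# Exact check of X = adj(A) B / det(A) for Question 11.
det_A = Fraction(34)
adj_A = [[13, 8, 1],
         [5, -10, 3],
         [3, -6, -5]]
B = [Fraction(1), Fraction(3, 2), Fraction(9)]

X = [sum(Fraction(adj_A[i][j]) * B[j] for j in range(3)) / det_A
     for i in range(3)]
print(X)   # [Fraction(1, 1), Fraction(1, 2), Fraction(-3, 2)]
```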
Question 12.
x – y + z = 4
2x + y – 3z = 0
x + y + z = 2
Answer:
Given:
The system of linear equations:
$x - y + z = 4$
$2x + y - 3z = 0$
$x + y + z = 2$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&-1&1\\2&1&-3\\1&1&1\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}4\\0\\2\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = 1 \begin{vmatrix}1&-3\\1&1\end{vmatrix} - (-1) \begin{vmatrix}2&-3\\1&1\end{vmatrix} + 1 \begin{vmatrix}2&1\\1&1\end{vmatrix}$
$\det(A) = 1((1)(1) - (-3)(1)) + 1((2)(1) - (-3)(1)) + 1((2)(1) - (1)(1))$
$\det(A) = 1(1 + 3) + 1(2 + 3) + 1(2 - 1)$
$\det(A) = 4 + 5 + 1 = 10$
Since $\det(A) = 10 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Next, we find the adjoint of $A$, $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
The cofactors $C_{ij}$ are:
$C_{11} = +\begin{vmatrix}1&-3\\1&1\end{vmatrix} = (1)(1) - (-3)(1) = 1 + 3 = 4$
$C_{12} = -\begin{vmatrix}2&-3\\1&1\end{vmatrix} = -((2)(1) - (-3)(1)) = -(2 + 3) = -5$
$C_{13} = +\begin{vmatrix}2&1\\1&1\end{vmatrix} = (2)(1) - (1)(1) = 2 - 1 = 1$
$C_{21} = -\begin{vmatrix}-1&1\\1&1\end{vmatrix} = -((-1)(1) - (1)(1)) = -(-1 - 1) = 2$
$C_{22} = +\begin{vmatrix}1&1\\1&1\end{vmatrix} = (1)(1) - (1)(1) = 1 - 1 = 0$
$C_{23} = -\begin{vmatrix}1&-1\\1&1\end{vmatrix} = -((1)(1) - (-1)(1)) = -(1 + 1) = -2$
$C_{31} = +\begin{vmatrix}-1&1\\1&-3\end{vmatrix} = (-1)(-3) - (1)(1) = 3 - 1 = 2$
$C_{32} = -\begin{vmatrix}1&1\\2&-3\end{vmatrix} = -((1)(-3) - (1)(2)) = -(-3 - 2) = 5$
$C_{33} = +\begin{vmatrix}1&-1\\2&1\end{vmatrix} = (1)(1) - (-1)(2) = 1 + 2 = 3$
The matrix of cofactors is $\begin{bmatrix}4&-5&1\\2&0&-2\\2&5&3\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}4&2&2\\-5&0&5\\1&-2&3\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{10} \begin{bmatrix}4&2&2\\-5&0&5\\1&-2&3\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{10} \begin{bmatrix}4&2&2\\-5&0&5\\1&-2&3\end{bmatrix} \begin{bmatrix}4\\0\\2\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{10} \begin{bmatrix} (4)(4) + (2)(0) + (2)(2) \\ (-5)(4) + (0)(0) + (5)(2) \\ (1)(4) + (-2)(0) + (3)(2) \end{bmatrix}$
$X = \frac{1}{10} \begin{bmatrix} 16 + 0 + 4 \\ -20 + 0 + 10 \\ 4 + 0 + 6 \end{bmatrix}$
$X = \frac{1}{10} \begin{bmatrix} 20 \\ -10 \\ 10 \end{bmatrix}$
Multiply by the scalar $\frac{1}{10}$:
$X = \begin{bmatrix} \frac{20}{10} \\ \frac{-10}{10} \\ \frac{10}{10} \end{bmatrix} = \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix}$
Equating the elements, we find $x=2$, $y=-1$, and $z=1$.
Conclusion:
The solution to the given system of equations is $x = 2$, $y = -1$, and $z = 1$.
Question 13.
2x + 3y + 3z = 5
x – 2y + z = –4
3x – y – 2z = 3
Answer:
Given:
The system of linear equations:
$2x + 3y + 3z = 5$
$x - 2y + z = -4$
$3x - y - 2z = 3$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}2&3&3\\1&-2&1\\3&-1&-2\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}5\\-4\\3\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = 2 \begin{vmatrix}-2&1\\-1&-2\end{vmatrix} - 3 \begin{vmatrix}1&1\\3&-2\end{vmatrix} + 3 \begin{vmatrix}1&-2\\3&-1\end{vmatrix}$
$\det(A) = 2((-2)(-2) - (1)(-1)) - 3((1)(-2) - (1)(3)) + 3((1)(-1) - (-2)(3))$
$\det(A) = 2(4 + 1) - 3(-2 - 3) + 3(-1 + 6)$
$\det(A) = 2(5) - 3(-5) + 3(5)$
$\det(A) = 10 + 15 + 15 = 40$
Since $\det(A) = 40 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Next, we find the adjoint of $A$, $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
The cofactors $C_{ij}$ are:
$C_{11} = +\begin{vmatrix}-2&1\\-1&-2\end{vmatrix} = (-2)(-2) - (1)(-1) = 4+1 = 5$
$C_{12} = -\begin{vmatrix}1&1\\3&-2\end{vmatrix} = -((1)(-2) - (1)(3)) = -(-2-3) = 5$
$C_{13} = +\begin{vmatrix}1&-2\\3&-1\end{vmatrix} = (1)(-1) - (-2)(3) = -1+6 = 5$
$C_{21} = -\begin{vmatrix}3&3\\-1&-2\end{vmatrix} = -((3)(-2) - (3)(-1)) = -(-6+3) = 3$
$C_{22} = +\begin{vmatrix}2&3\\3&-2\end{vmatrix} = (2)(-2) - (3)(3) = -4-9 = -13$
$C_{23} = -\begin{vmatrix}2&3\\3&-1\end{vmatrix} = -((2)(-1) - (3)(3)) = -(-2-9) = 11$
$C_{31} = +\begin{vmatrix}3&3\\-2&1\end{vmatrix} = (3)(1) - (3)(-2) = 3+6 = 9$
$C_{32} = -\begin{vmatrix}2&3\\1&1\end{vmatrix} = -((2)(1) - (3)(1)) = -(2-3) = 1$
$C_{33} = +\begin{vmatrix}2&3\\1&-2\end{vmatrix} = (2)(-2) - (3)(1) = -4-3 = -7$
The matrix of cofactors is $\begin{bmatrix}5&5&5\\3&-13&11\\9&1&-7\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}5&3&9\\5&-13&1\\5&11&-7\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{40} \begin{bmatrix}5&3&9\\5&-13&1\\5&11&-7\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{40} \begin{bmatrix}5&3&9\\5&-13&1\\5&11&-7\end{bmatrix} \begin{bmatrix}5\\-4\\3\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{40} \begin{bmatrix} (5)(5) + (3)(-4) + (9)(3) \\ (5)(5) + (-13)(-4) + (1)(3) \\ (5)(5) + (11)(-4) + (-7)(3) \end{bmatrix}$
$X = \frac{1}{40} \begin{bmatrix} 25 - 12 + 27 \\ 25 + 52 + 3 \\ 25 - 44 - 21 \end{bmatrix}$
$X = \frac{1}{40} \begin{bmatrix} 40 \\ 80 \\ -40 \end{bmatrix}$
Multiply by the scalar $\frac{1}{40}$:
$X = \begin{bmatrix} \frac{40}{40} \\ \frac{80}{40} \\ \frac{-40}{40} \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix}$
Equating the elements, we find $x=1$, $y=2$, and $z=-1$.
Conclusion:
The solution to the given system of equations is $x = 1$, $y = 2$, and $z = -1$.
Question 14.
x – y + 2z = 7
3x + 4y – 5z = – 5
2x – y + 3z = 12
Answer:
Given:
The system of linear equations:
$x - y + 2z = 7$
$3x + 4y - 5z = -5$
$2x - y + 3z = 12$
Solution:
We can write the given system of equations in the matrix form $AX = B$, where:
$A = \begin{bmatrix}1&-1&2\\3&4&-5\\2&-1&3\end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}7\\-5\\12\end{bmatrix}$
To solve for $X$ using the matrix method, we need to find the inverse of matrix $A$, i.e., $A^{-1}$. The solution is given by $X = A^{-1}B$. First, we calculate the determinant of $A$ to check if it is invertible.
$\det(A) = 1 \begin{vmatrix}4&-5\\-1&3\end{vmatrix} - (-1) \begin{vmatrix}3&-5\\2&3\end{vmatrix} + 2 \begin{vmatrix}3&4\\2&-1\end{vmatrix}$
$\det(A) = 1((4)(3) - (-5)(-1)) + 1((3)(3) - (-5)(2)) + 2((3)(-1) - (4)(2))$
$\det(A) = 1(12 - 5) + 1(9 - (-10)) + 2(-3 - 8)$
$\det(A) = 1(7) + 1(9 + 10) + 2(-11)$
$\det(A) = 7 + 19 - 22 = 4$
Since $\det(A) = 4 \neq 0$, the matrix $A$ is nonsingular, and its inverse $A^{-1}$ exists. The system is consistent and has a unique solution.
Next, we find the adjoint of $A$, $\text{adj}(A)$. This is the transpose of the matrix of cofactors of $A$.
The cofactors $C_{ij}$ are:
$C_{11} = +\begin{vmatrix}4&-5\\-1&3\end{vmatrix} = (4)(3) - (-5)(-1) = 12 - 5 = 7$
$C_{12} = -\begin{vmatrix}3&-5\\2&3\end{vmatrix} = -((3)(3) - (-5)(2)) = -(9 + 10) = -19$
$C_{13} = +\begin{vmatrix}3&4\\2&-1\end{vmatrix} = (3)(-1) - (4)(2) = -3 - 8 = -11$
$C_{21} = -\begin{vmatrix}-1&2\\-1&3\end{vmatrix} = -((-1)(3) - (2)(-1)) = -(-3 + 2) = 1$
$C_{22} = +\begin{vmatrix}1&2\\2&3\end{vmatrix} = (1)(3) - (2)(2) = 3 - 4 = -1$
$C_{23} = -\begin{vmatrix}1&-1\\2&-1\end{vmatrix} = -((1)(-1) - (-1)(2)) = -(-1 + 2) = -1$
$C_{31} = +\begin{vmatrix}-1&2\\4&-5\end{vmatrix} = (-1)(-5) - (2)(4) = 5 - 8 = -3$
$C_{32} = -\begin{vmatrix}1&2\\3&-5\end{vmatrix} = -((1)(-5) - (2)(3)) = -(-5 - 6) = 11$
$C_{33} = +\begin{vmatrix}1&-1\\3&4\end{vmatrix} = (1)(4) - (-1)(3) = 4 + 3 = 7$
The matrix of cofactors is $\begin{bmatrix}7&-19&-11\\1&-1&-1\\-3&11&7\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}7&1&-3\\-19&-1&11\\-11&-1&7\end{bmatrix}$
Now, we calculate the inverse of $A$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{4} \begin{bmatrix}7&1&-3\\-19&-1&11\\-11&-1&7\end{bmatrix}$
Finally, we solve for $X$ using $X = A^{-1}B$:
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \frac{1}{4} \begin{bmatrix}7&1&-3\\-19&-1&11\\-11&-1&7\end{bmatrix} \begin{bmatrix}7\\-5\\12\end{bmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{4} \begin{bmatrix} (7)(7) + (1)(-5) + (-3)(12) \\ (-19)(7) + (-1)(-5) + (11)(12) \\ (-11)(7) + (-1)(-5) + (7)(12) \end{bmatrix}$
$X = \frac{1}{4} \begin{bmatrix} 49 - 5 - 36 \\ -133 + 5 + 132 \\ -77 + 5 + 84 \end{bmatrix}$
$X = \frac{1}{4} \begin{bmatrix} 49 - 41 \\ -133 + 137 \\ -77 + 89 \end{bmatrix}$
$X = \frac{1}{4} \begin{bmatrix} 8 \\ 4 \\ 12 \end{bmatrix}$
Multiply by the scalar $\frac{1}{4}$:
$X = \begin{bmatrix} \frac{8}{4} \\ \frac{4}{4} \\ \frac{12}{4} \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix}$
Equating the elements, we find $x=2$, $y=1$, and $z=3$.
Conclusion:
The solution to the given system of equations is $x = 2$, $y = 1$, and $z = 3$.
Question 15. If A = $\begin{bmatrix}2&−3&5\\3&2&−4\\1&1&−2 \end{bmatrix}$, find $A^{-1}$. Using $A^{-1}$, solve the system of equations
2x – 3y + 5z = 11
3x + 2y – 4z = – 5
x + y – 2z = – 3
Answer:
Given:
Matrix $A = \begin{bmatrix}2&−3&5\\3&2&−4\\1&1&−2 \end{bmatrix}$
System of equations:
$2x – 3y + 5z = 11$
$3x + 2y – 4z = – 5$
$x + y – 2z = – 3$
To Find:
The inverse of matrix A ($A^{-1}$).
The solution to the system of equations using $A^{-1}$.
Solution:
First, we find the inverse of matrix $A$. To do this, we need the determinant of $A$ and the adjoint of $A$.
Calculate $\det(A)$:
$\det(A) = 2 \begin{vmatrix}2&−4\\1&−2\end{vmatrix} - (−3) \begin{vmatrix}3&−4\\1&−2\end{vmatrix} + 5 \begin{vmatrix}3&2\\1&1\end{vmatrix}$
$\det(A) = 2((2)(−2) − (−4)(1)) + 3((3)(−2) − (−4)(1)) + 5((3)(1) − (2)(1))$
$\det(A) = 2(−4 + 4) + 3(−6 + 4) + 5(3 − 2)$
$\det(A) = 2(0) + 3(−2) + 5(1) = 0 - 6 + 5 = -1$
Since $\det(A) = -1 \neq 0$, $A$ is invertible, and $A^{-1}$ exists.
Next, find the matrix of cofactors of $A$:
$C_{11} = +\begin{vmatrix}2&-4\\1&-2\end{vmatrix} = 2(-2) - (-4)(1) = -4 + 4 = 0$
$C_{12} = -\begin{vmatrix}3&-4\\1&-2\end{vmatrix} = -[3(-2) - (-4)(1)] = -[-6 + 4] = 2$
$C_{13} = +\begin{vmatrix}3&2\\1&1\end{vmatrix} = 3(1) - 2(1) = 3 - 2 = 1$
$C_{21} = -\begin{vmatrix}-3&5\\1&-2\end{vmatrix} = -[(-3)(-2) - 5(1)] = -[6 - 5] = -1$
$C_{22} = +\begin{vmatrix}2&5\\1&-2\end{vmatrix} = 2(-2) - 5(1) = -4 - 5 = -9$
$C_{23} = -\begin{vmatrix}2&-3\\1&1\end{vmatrix} = -[2(1) - (-3)(1)] = -[2 + 3] = -5$
$C_{31} = +\begin{vmatrix}-3&5\\2&-4\end{vmatrix} = (-3)(-4) - 5(2) = 12 - 10 = 2$
$C_{32} = -\begin{vmatrix}2&5\\3&-4\end{vmatrix} = -[2(-4) - 5(3)] = -[-8 - 15] = 23$
$C_{33} = +\begin{vmatrix}2&-3\\3&2\end{vmatrix} = 2(2) - (-3)(3) = 4 + 9 = 13$
The matrix of cofactors is $\begin{bmatrix}0&2&1\\-1&-9&-5\\2&23&13\end{bmatrix}$.
The adjoint of $A$ is the transpose of the cofactor matrix:
$\text{adj}(A) = \begin{bmatrix}0&-1&2\\2&-9&23\\1&-5&13\end{bmatrix}$
Now, calculate $A^{-1}$:
$A^{-1} = \frac{1}{\det(A)} \text{adj}(A) = \frac{1}{-1} \begin{bmatrix}0&-1&2\\2&-9&23\\1&-5&13\end{bmatrix} = -1 \begin{bmatrix}0&-1&2\\2&-9&23\\1&-5&13\end{bmatrix}$
$A^{-1} = \begin{bmatrix}0&1&-2\\-2&9&-23\\-1&5&-13\end{bmatrix}$
Now, we use $A^{-1}$ to solve the system of equations. The system can be written in matrix form $AX = B$, where:
$A = \begin{bmatrix}2&−3&5\\3&2&−4\\1&1&−2 \end{bmatrix}$, $X = \begin{bmatrix}x\\y\\z\end{bmatrix}$, and $B = \begin{bmatrix}11\\-5\\-3\end{bmatrix}$
We have already found the inverse of $A$. The solution is given by $X = A^{-1}B$.
$X = \begin{bmatrix}x\\y\\z\end{bmatrix} = \begin{bmatrix}0&1&-2\\-2&9&-23\\-1&5&-13\end{bmatrix} \begin{bmatrix}11\\-5\\-3\end{bmatrix}$
Perform the matrix multiplication:
$X = \begin{bmatrix} (0)(11) + (1)(-5) + (-2)(-3) \\ (-2)(11) + (9)(-5) + (-23)(-3) \\ (-1)(11) + (5)(-5) + (-13)(-3) \end{bmatrix}$
$X = \begin{bmatrix} 0 - 5 + 6 \\ -22 - 45 + 69 \\ -11 - 25 + 39 \end{bmatrix}$
$X = \begin{bmatrix} 1 \\ -67 + 69 \\ -36 + 39 \end{bmatrix}$
$X = \begin{bmatrix}1\\2\\3\end{bmatrix}$
Equating the corresponding elements, we find the values of $x$, $y$, and $z$.
Conclusion:
The inverse of matrix A is $A^{-1} = \begin{bmatrix}0&1&-2\\-2&9&-23\\-1&5&-13\end{bmatrix}$.
The solution to the given system of equations is $x = 1$, $y = 2$, and $z = 3$.
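The cofactor computation above can be mechanised directly from the definition $C_{ij} = (-1)^{i+j} M_{ij}$. Below is a minimal sketch, assuming Python with NumPy; each minor is formed by deleting row $i$ and column $j$.

```python
import numpy as np

# Build adj(A) from cofactors, then solve AX = B for Question 15.
A = np.array([[2, -3, 5],
              [3, 2, -4],
              [1, 1, -2]], dtype=float)
B = np.array([11, -5, -3], dtype=float)

cof = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

adj_A = cof.T                      # adjoint = transpose of the cofactor matrix
det_A = np.linalg.det(A)           # -1
A_inv = adj_A / det_A

print(np.round(A_inv))             # matches A^{-1} found above (as floats)
print(np.round(A_inv @ B))         # [1. 2. 3.]  ->  x = 1, y = 2, z = 3
```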
Question 16. The cost of 4 kg onion, 3 kg wheat and 2 kg rice is ₹ 60. The cost of 2 kg onion, 4 kg wheat and 6 kg rice is ₹ 90. The cost of 6 kg onion, 2 kg wheat and 3 kg rice is ₹ 70. Find the cost of each item per kg by the matrix method.
Answer:
Given:
The cost of 4 kg onion, 3 kg wheat and 2 kg rice is $\textsf{₹} 60$.
The cost of 2 kg onion, 4 kg wheat and 6 kg rice is $\textsf{₹} 90$.
The cost of 6 kg onion, 2 kg wheat and 3 kg rice is $\textsf{₹} 70$.
To Find:
The cost of each item per kg by matrix method.
Solution:
Let the cost per kg of onion, wheat, and rice be $x$, $y$, and $z$ respectively.
From the given information, we can form the following system of linear equations:
$4x + 3y + 2z = 60$
$2x + 4y + 6z = 90$
$6x + 2y + 3z = 70$
The second equation can be simplified by dividing by 2:
$x + 2y + 3z = 45$
The system of equations is:
$4x + 3y + 2z = 60$
$x + 2y + 3z = 45$
$6x + 2y + 3z = 70$
We can write this system in matrix form $AX = B$, where:
$A = \begin{pmatrix} 4 & 3 & 2 \\ 1 & 2 & 3 \\ 6 & 2 & 3 \end{pmatrix}$, $X = \begin{pmatrix} x \\ y \\ z \end{pmatrix}$, $B = \begin{pmatrix} 60 \\ 45 \\ 70 \end{pmatrix}$
So, the matrix equation is:
$\begin{pmatrix} 4 & 3 & 2 \\ 1 & 2 & 3 \\ 6 & 2 & 3 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 60 \\ 45 \\ 70 \end{pmatrix}$
To solve for $X$, we need to find the inverse of matrix $A$, $A^{-1}$.
First, calculate the determinant of $A$, $\det(A)$.
$\det(A) = 4 \begin{vmatrix} 2 & 3 \\ 2 & 3 \end{vmatrix} - 3 \begin{vmatrix} 1 & 3 \\ 6 & 3 \end{vmatrix} + 2 \begin{vmatrix} 1 & 2 \\ 6 & 2 \end{vmatrix}$
$\det(A) = 4(2 \times 3 - 3 \times 2) - 3(1 \times 3 - 3 \times 6) + 2(1 \times 2 - 2 \times 6)$
$\det(A) = 4(6 - 6) - 3(3 - 18) + 2(2 - 12)$
$\det(A) = 4(0) - 3(-15) + 2(-10)$
$\det(A) = 0 + 45 - 20$
$\det(A) = 25$
Since $\det(A) \neq 0$, the inverse matrix $A^{-1}$ exists.
Next, find the cofactor matrix of $A$. The cofactors are:
$C_{11} = +(2 \times 3 - 3 \times 2) = 0$
$C_{12} = -(1 \times 3 - 3 \times 6) = -(3 - 18) = 15$
$C_{13} = +(1 \times 2 - 2 \times 6) = +(2 - 12) = -10$
$C_{21} = -(3 \times 3 - 2 \times 2) = -(9 - 4) = -5$
$C_{22} = +(4 \times 3 - 2 \times 6) = +(12 - 12) = 0$
$C_{23} = -(4 \times 2 - 3 \times 6) = -(8 - 18) = 10$
$C_{31} = +(3 \times 3 - 2 \times 2) = +(9 - 4) = 5$
$C_{32} = -(4 \times 3 - 2 \times 1) = -(12 - 2) = -10$
$C_{33} = +(4 \times 2 - 3 \times 1) = +(8 - 3) = 5$
The cofactor matrix $C$ is:
$C = \begin{pmatrix} 0 & 15 & -10 \\ -5 & 0 & 10 \\ 5 & -10 & 5 \end{pmatrix}$
The adjoint of $A$, $\text{adj}(A)$, is the transpose of the cofactor matrix:
$\text{adj}(A) = C^T = \begin{pmatrix} 0 & -5 & 5 \\ 15 & 0 & -10 \\ -10 & 10 & 5 \end{pmatrix}$
The inverse of $A$ is $A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$.
$A^{-1} = \frac{1}{25} \begin{pmatrix} 0 & -5 & 5 \\ 15 & 0 & -10 \\ -10 & 10 & 5 \end{pmatrix}$
Now, we can find $X$ using $X = A^{-1}B$:
$X = \frac{1}{25} \begin{pmatrix} 0 & -5 & 5 \\ 15 & 0 & -10 \\ -10 & 10 & 5 \end{pmatrix} \begin{pmatrix} 60 \\ 45 \\ 70 \end{pmatrix}$
Perform the matrix multiplication:
$X = \frac{1}{25} \begin{pmatrix} (0 \times 60) + (-5 \times 45) + (5 \times 70) \\ (15 \times 60) + (0 \times 45) + (-10 \times 70) \\ (-10 \times 60) + (10 \times 45) + (5 \times 70) \end{pmatrix}$
$X = \frac{1}{25} \begin{pmatrix} 0 - 225 + 350 \\ 900 + 0 - 700 \\ -600 + 450 + 350 \end{pmatrix}$
$X = \frac{1}{25} \begin{pmatrix} 125 \\ 200 \\ 200 \end{pmatrix}$
$X = \begin{pmatrix} 125/25 \\ 200/25 \\ 200/25 \end{pmatrix} = \begin{pmatrix} 5 \\ 8 \\ 8 \end{pmatrix}$
So, $x = 5$, $y = 8$, and $z = 8$.
Therefore, the cost of each item per kg is:
Cost of onion per kg = $\textsf{₹}5$
Cost of wheat per kg = $\textsf{₹}8$
Cost of rice per kg = $\textsf{₹}8$
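A quick cross-check, assuming Python with NumPy, solves the original (unsimplified) system and confirms that halving the second equation does not change the answer.

```python
import numpy as np

# Cross-check of the cost problem using the original equations.
A = np.array([[4, 3, 2],
              [2, 4, 6],
              [6, 2, 3]], dtype=float)
B = np.array([60, 90, 70], dtype=float)

print(np.linalg.solve(A, B))   # [5. 8. 8.] -> onion Rs 5, wheat Rs 8, rice Rs 8 per kg
```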
Example 30 to 34 - Miscellaneous Examples
Example 30: If a, b, c are positive and unequal, show that value of the determinant
$∆ = \begin{vmatrix} a&b&c\\b&c&a\\c&a&b \end{vmatrix}$ is negative.
Answer:
Given:
The determinant $∆ = \begin{vmatrix} a&b&c\\b&c&a\\c&a&b \end{vmatrix}$, where $a, b, c$ are positive and unequal.
To Show:
The value of the determinant $∆$ is negative.
Solution:
Let's calculate the determinant $∆$:
$∆ = a \begin{vmatrix} c&a\\a&b \end{vmatrix} - b \begin{vmatrix} b&a\\c&b \end{vmatrix} + c \begin{vmatrix} b&c\\c&a \end{vmatrix}$
$∆ = a(cb - a \times a) - b(b \times b - a \times c) + c(b \times a - c \times c)$
$∆ = a(bc - a^2) - b(b^2 - ac) + c(ab - c^2)$
$∆ = abc - a^3 - b^3 + abc + abc - c^3$
Combine like terms:
$∆ = 3abc - a^3 - b^3 - c^3$
Rearrange the terms and factor out $-1$:
$∆ = -(a^3 + b^3 + c^3 - 3abc)$
We use the algebraic identity: $a^3 + b^3 + c^3 - 3abc = (a+b+c)(a^2+b^2+c^2 - ab - bc - ca)$.
So, $∆ = -(a+b+c)(a^2+b^2+c^2 - ab - bc - ca)$
Now, consider the expression $a^2+b^2+c^2 - ab - bc - ca$. We can rewrite it as:
$a^2+b^2+c^2 - ab - bc - ca = \frac{1}{2}(2a^2+2b^2+2c^2 - 2ab - 2bc - 2ca)$
$= \frac{1}{2}[(a^2 - 2ab + b^2) + (b^2 - 2bc + c^2) + (c^2 - 2ca + a^2)]$
$= \frac{1}{2}[(a-b)^2 + (b-c)^2 + (c-a)^2]$
Substituting this back into the expression for $∆$:
$∆ = -(a+b+c) \frac{1}{2}[(a-b)^2 + (b-c)^2 + (c-a)^2]$
We are given that $a, b, c$ are positive. Therefore, $(a+b+c)$ is positive.
We are also given that $a, b, c$ are unequal, so the differences $a-b$, $b-c$ and $c-a$ are not all zero. Consequently, $(a-b)^2$, $(b-c)^2$ and $(c-a)^2$ are non-negative, and at least one of them is strictly positive (all three squares vanish only when $a = b = c$, which is excluded).
Therefore, $(a-b)^2 + (b-c)^2 + (c-a)^2 > 0$.
This implies that $\frac{1}{2}[(a-b)^2 + (b-c)^2 + (c-a)^2]$ is positive.
So, $∆ = -( \text{positive term} ) \times ( \text{positive term} )$
$∆ = -(\text{a positive value})$
Thus, the value of the determinant $∆$ is negative.
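A symbolic cross-check of the expansion and factorisation, assuming Python with SymPy:

```python
import sympy as sp

# Confirm that the determinant equals 3abc - a^3 - b^3 - c^3,
# i.e. -(a+b+c)(a^2+b^2+c^2-ab-bc-ca).
a, b, c = sp.symbols('a b c', positive=True)
D = sp.Matrix([[a, b, c],
               [b, c, a],
               [c, a, b]]).det()

print(sp.expand(D))   # 3*a*b*c - a**3 - b**3 - c**3 (term order may differ)
print(sp.simplify(D + (a + b + c)*(a**2 + b**2 + c**2 - a*b - b*c - c*a)))  # 0
```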
Example 31: If a, b, c, are in A.P, find value of
$\begin{vmatrix} 2y+4&5y+7&8y+a\\3y+5&6y+8&9y+b\\4y+6&7y+9&10y+c \end{vmatrix}$
Answer:
Given:
The determinant $∆ = \begin{vmatrix} 2y+4&5y+7&8y+a\\3y+5&6y+8&9y+b\\4y+6&7y+9&10y+c \end{vmatrix}$.
$a, b, c$ are in A.P.
To Find:
The value of the determinant.
Solution:
Let the given determinant be $D$.
$D = \begin{vmatrix} 2y+4&5y+7&8y+a\\3y+5&6y+8&9y+b\\4y+6&7y+9&10y+c \end{vmatrix}$
Since $a, b, c$ are in A.P., the difference between consecutive terms is constant. That is, $b-a = c-b$. Let this common difference be $d$. So, $b-a = d$ and $c-b = d$.
We will use elementary row operations to simplify the determinant without changing its value.
Apply the operation $R_2 \to R_2 - R_1$:
The elements of the new second row are:
$(3y+5) - (2y+4) = y+1$
$(6y+8) - (5y+7) = y+1$
$(9y+b) - (8y+a) = y + (b-a)$
The determinant becomes:
$D = \begin{vmatrix} 2y+4 & 5y+7 & 8y+a \\ y+1 & y+1 & y+(b-a) \\ 4y+6 & 7y+9 & 10y+c \end{vmatrix}$
Now, apply the operation $R_3 \to R_3 - R_2$, where the subtracted $R_2$ is the second row of the original determinant (subtracting any combination of the other rows from $R_3$ leaves the value unchanged):
The elements of the new third row are:
$(4y+6) - (3y+5) = y+1$
$(7y+9) - (6y+8) = y+1$
$(10y+c) - (9y+b) = y + (c-b)$
The determinant becomes:
$D = \begin{vmatrix} 2y+4 & 5y+7 & 8y+a \\ y+1 & y+1 & y+(b-a) \\ y+1 & y+1 & y+(c-b) \end{vmatrix}$
Now, substitute the A.P. property $b-a = c-b$. Let $d = b-a = c-b$.
$D = \begin{vmatrix} 2y+4 & 5y+7 & 8y+a \\ y+1 & y+1 & y+d \\ y+1 & y+1 & y+d \end{vmatrix}$
Observe the second and third rows of the determinant:
Row 2: $(y+1, y+1, y+d)$
Row 3: $(y+1, y+1, y+d)$
Since the second and third rows of the determinant are identical ($R_2 = R_3$), the value of the determinant is 0.
Therefore, $D = 0$.
The value of the determinant is $\mathbf{0}$.
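A short numerical sanity check of this result, assuming NumPy is available: pick any $y$ and any three numbers in A.P., and the determinant comes out as zero (up to floating-point round-off).

```python
# Numerical spot check of Example 31 (assumes NumPy; not the textbook method).
import numpy as np

y = 2.5
a, b, c = 4.0, 7.0, 10.0          # any A.P. (common difference 3)
M = np.array([
    [2*y + 4, 5*y + 7,  8*y + a],
    [3*y + 5, 6*y + 8,  9*y + b],
    [4*y + 6, 7*y + 9, 10*y + c],
])
print(np.linalg.det(M))           # ~0, up to floating-point round-off
```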
Example 32: Show that
$∆ = \begin{vmatrix} (y+z)^2&xy&zx\\xy&(x+z)^2&yz\\xz&yz&(x+y)^2 \end{vmatrix} = 2xyz (x + y + z)^3$
Answer:
Given:
The determinant $∆ = \begin{vmatrix} (y+z)^2&xy&zx\\xy&(x+z)^2&yz\\xz&yz&(x+y)^2 \end{vmatrix}$.
To Show:
$∆ = 2xyz (x + y + z)^3$.
Solution:
Let the given determinant be $∆$.
$∆ = \begin{vmatrix} (y+z)^2&xy&zx\\xy&(x+z)^2&yz\\xz&yz&(x+y)^2 \end{vmatrix}$
Multiply $R_1$ by $x$, $R_2$ by $y$, $R_3$ by $z$. To keep the value of the determinant unchanged, we must divide by $xyz$.
$∆ = \frac{1}{xyz} \begin{vmatrix} x(y+z)^2&x^2y&zx^2\\xy^2&y(x+z)^2&y^2z\\xz^2&yz^2&z(x+y)^2 \end{vmatrix}$
Now, take out common factors $x$ from $C_1$, $y$ from $C_2$, and $z$ from $C_3$.
$∆ = \frac{xyz}{xyz} \begin{vmatrix} (y+z)^2&x^2&x^2\\y^2&(x+z)^2&y^2\\z^2&z^2&(x+y)^2 \end{vmatrix}$
$∆ = \begin{vmatrix} (y+z)^2&x^2&x^2\\y^2&(x+z)^2&y^2\\z^2&z^2&(x+y)^2 \end{vmatrix}$
Let this determinant be denoted by $D$. So, $∆ = D$. We will now evaluate $D$.
Finding factors of D:
Factor $x$: If we set $x=0$, the determinant becomes:
$D = \begin{vmatrix} (y+z)^2&0&0\\y^2&z^2&y^2\\z^2&z^2&y^2 \end{vmatrix} = (y+z)^2 (z^2y^2 - y^2z^2) = (y+z)^2 (0) = 0$.
Since $D=0$ when $x=0$, $x$ is a factor of $D$. By symmetry, $y$ and $z$ are also factors of $D$. Therefore, $xyz$ is a factor of $D$.
Factor $(x+y+z)$: If we set $x+y+z=0$, then $y+z=-x$, $x+z=-y$, $x+y=-z$.
$D = \begin{vmatrix} (-x)^2&x^2&x^2\\y^2&(-y)^2&y^2\\z^2&z^2&(-z)^2 \end{vmatrix} = \begin{vmatrix} x^2&x^2&x^2\\y^2&y^2&y^2\\z^2&z^2&z^2 \end{vmatrix}$
Since all three columns are identical, the value of the determinant is $0$. Thus, $(x+y+z)$ is a factor of $D$.
Factor $(x+y+z)^2$: Apply column operations $C_1 \to C_1 - C_3$ and $C_2 \to C_2 - C_3$ on $D$.
$D = \begin{vmatrix} (y+z)^2 - x^2 & x^2 - x^2 & x^2 \\ y^2 - y^2 & (x+z)^2 - y^2 & y^2 \\ z^2 - (x+y)^2 & z^2 - (x+y)^2 & (x+y)^2 \end{vmatrix}$
$D = \begin{vmatrix} (y+z-x)(y+z+x) & 0 & x^2 \\ 0 & (x+z-y)(x+z+y) & y^2 \\ (z-(x+y))(z+(x+y)) & (z-(x+y))(z+(x+y)) & (x+y)^2 \end{vmatrix}$
$D = \begin{vmatrix} (y+z-x)(x+y+z) & 0 & x^2 \\ 0 & (x+z-y)(x+y+z) & y^2 \\ -(x+y-z)(x+y+z) & -(x+y-z)(x+y+z) & (x+y)^2 \end{vmatrix}$
Take $(x+y+z)$ common from $C_1$ and $C_2$.
$D = (x+y+z)^2 \begin{vmatrix} y+z-x & 0 & x^2 \\ 0 & x+z-y & y^2 \\ -(x+y-z) & -(x+y-z) & (x+y)^2 \end{vmatrix}$
This shows that $(x+y+z)^2$ is a factor of $D$. Let $D'$ be the remaining determinant:
$D' = \begin{vmatrix} y+z-x & 0 & x^2 \\ 0 & x+z-y & y^2 \\ -(x+y-z) & -(x+y-z) & (x+y)^2 \end{vmatrix}$
Factor $(x+y+z)^3$: We need to check if $(x+y+z)$ is a factor of $D'$.
Set $x+y+z=0$. Then $y+z=-x$, $x+z=-y$, $x+y=-z$.
Also, $y+z-x = -x-x = -2x$.
$x+z-y = -y-y = -2y$.
$x+y-z = -z-z = -2z$.
Substitute these into $D'$:
$D' = \begin{vmatrix} -2x & 0 & x^2 \\ 0 & -2y & y^2 \\ -(-2z) & -(-2z) & (-z)^2 \end{vmatrix} = \begin{vmatrix} -2x & 0 & x^2 \\ 0 & -2y & y^2 \\ 2z & 2z & z^2 \end{vmatrix}$
Expand the determinant $D'$:
$D' = -2x \begin{vmatrix} -2y & y^2 \\ 2z & z^2 \end{vmatrix} - 0 + x^2 \begin{vmatrix} 0 & -2y \\ 2z & 2z \end{vmatrix}$
$D' = -2x ((-2y)(z^2) - (y^2)(2z)) + x^2 (0 - (-2y)(2z))$
$D' = -2x (-2yz^2 - 2y^2z) + x^2 (4yz)$
$D' = -2x (-2yz(z+y)) + 4x^2yz$
Since $x+y+z=0$, we have $z+y = -x$.
$D' = -2x (-2yz(-x)) + 4x^2yz$
$D' = -2x (2xyz) + 4x^2yz$
$D' = -4x^2yz + 4x^2yz = 0$.
Since $D'=0$ when $x+y+z=0$, $(x+y+z)$ is a factor of $D'$.
Therefore, $(x+y+z)^2 \times (x+y+z) = (x+y+z)^3$ is a factor of $D$.
Combining Factors:
We have shown that $x, y, z,$ and $(x+y+z)^3$ are factors of $D$. The degree of $D$ is 6 (e.g., the term $(y+z)^2(x+z)^2(x+y)^2$ has degree 6). The degree of the combined factor $xyz(x+y+z)^3$ is $1+1+1+3=6$.
Thus, $D$ must be a constant multiple of $xyz(x+y+z)^3$.
$D = k \cdot xyz (x+y+z)^3$ for some constant $k$.
Finding the constant k:
To find $k$, we substitute specific non-zero values for $x, y, z$. Let $x=1, y=1, z=1$.
$D = \begin{vmatrix} (1+1)^2&1^2&1^2\\1^2&(1+1)^2&1^2\\1^2&1^2&(1+1)^2 \end{vmatrix} = \begin{vmatrix} 4&1&1\\1&4&1\\1&1&4 \end{vmatrix}$
$D = 4(4 \times 4 - 1 \times 1) - 1(1 \times 4 - 1 \times 1) + 1(1 \times 1 - 4 \times 1)$
$D = 4(16 - 1) - 1(4 - 1) + 1(1 - 4)$
$D = 4(15) - 1(3) + 1(-3) = 60 - 3 - 3 = 54$.
Now substitute $x=1, y=1, z=1$ into the factored form:
$D = k \cdot (1)(1)(1) (1+1+1)^3 = k \cdot 1 \cdot (3)^3 = k \cdot 27$.
Equating the two values of $D$:
$27k = 54$
$k = \frac{54}{27} = 2$.
So, $D = 2xyz(x+y+z)^3$.
Conclusion:
Since $∆ = D$, we have shown that:
$∆ = 2xyz (x+y+z)^3$.
Hence Proved.
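For readers who want an independent confirmation, the whole identity can be verified symbolically with SymPy (a sketch, not part of the textbook proof):

```python
# Symbolic confirmation of Example 32 (assumes SymPy; a cross-check only).
import sympy as sp

x, y, z = sp.symbols('x y z')
D = sp.Matrix([
    [(y + z)**2, x*y,        z*x],
    [x*y,        (x + z)**2, y*z],
    [x*z,        y*z,        (x + y)**2],
]).det()

assert sp.simplify(D - 2*x*y*z*(x + y + z)**3) == 0
print(sp.factor(D))   # 2*x*y*z*(x + y + z)**3
```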
Example 33: Use product $\begin{bmatrix}1&-1&2\\0&2&-3\\3&-2&4 \end{bmatrix} \begin{bmatrix}-2&0&1\\9&2&-3\\6&1&-2 \end{bmatrix}$ to solve the system of equations
x – y + 2z = 1
2y – 3z = 1
3x – 2y + 4z = 2
Answer:
Given:
The matrix product: $P = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -3 \\ 3 & -2 & 4 \end{bmatrix} \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix}$
The system of linear equations:
$x – y + 2z = 1$
$0x + 2y – 3z = 1$
$3x – 2y + 4z = 2$
To Find:
The solution (values of $x, y, z$) for the given system of equations using the given matrix product.
Solution:
First, let's calculate the given matrix product.
Let $A = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -3 \\ 3 & -2 & 4 \end{bmatrix}$ and $B = \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix}$.
Product $AB = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -3 \\ 3 & -2 & 4 \end{bmatrix} \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix}$
$AB = \begin{bmatrix} (1)(-2)+(-1)(9)+(2)(6) & (1)(0)+(-1)(2)+(2)(1) & (1)(1)+(-1)(-3)+(2)(-2) \\ (0)(-2)+(2)(9)+(-3)(6) & (0)(0)+(2)(2)+(-3)(1) & (0)(1)+(2)(-3)+(-3)(-2) \\ (3)(-2)+(-2)(9)+(4)(6) & (3)(0)+(-2)(2)+(4)(1) & (3)(1)+(-2)(-3)+(4)(-2) \end{bmatrix}$
$AB = \begin{bmatrix} -2-9+12 & 0-2+2 & 1+3-4 \\ 0+18-18 & 0+4-3 & 0-6+6 \\ -6-18+24 & 0-4+4 & 3+6-8 \end{bmatrix}$
$AB = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = I$
So, the product of the given matrices is the identity matrix $I$.
Now, let's represent the given system of equations in matrix form $CX = D$.
$x – y + 2z = 1$
$0x + 2y – 3z = 1$
$3x – 2y + 4z = 2$
The coefficient matrix is $C = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -3 \\ 3 & -2 & 4 \end{bmatrix}$.
The variable matrix is $X = \begin{bmatrix} x \\ y \\ z \end{bmatrix}$.
The constant matrix is $D = \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}$.
The system is $CX = D$.
We observe that the coefficient matrix $C$ is the same as the first matrix $A$ in the given product.
$C = A = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -3 \\ 3 & -2 & 4 \end{bmatrix}$
From the product calculation, we found that $AB = I$. Since $C=A$, we have $CB = I$.
By the definition of the inverse of a matrix, if $CB = I$, then $B$ is the inverse of $C$.
So, $C^{-1} = B = \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix}$.
To solve the system $CX = D$, we multiply both sides by $C^{-1}$ on the left:
$C^{-1}(CX) = C^{-1}D$
$(C^{-1}C)X = C^{-1}D$
$IX = C^{-1}D$
$X = C^{-1}D$
Now, substitute the matrices for $C^{-1}$ and $D$:
$X = \begin{bmatrix} -2 & 0 & 1 \\ 9 & 2 & -3 \\ 6 & 1 & -2 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}$
Perform the matrix multiplication:
$X = \begin{bmatrix} (-2)(1)+(0)(1)+(1)(2) \\ (9)(1)+(2)(1)+(-3)(2) \\ (6)(1)+(1)(1)+(-2)(2) \end{bmatrix}$
$X = \begin{bmatrix} -2+0+2 \\ 9+2-6 \\ 6+1-4 \end{bmatrix}$
$X = \begin{bmatrix} 0 \\ 5 \\ 3 \end{bmatrix}$
Since $X = \begin{bmatrix} x \\ y \\ z \end{bmatrix}$, we have:
$x = 0, y = 5, z = 3$
Therefore, the solution to the given system of equations is $x=0$, $y=5$, and $z=3$.
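A minimal numerical check of this example, assuming NumPy is available, confirms both that the given product is the identity and that the computed solution is correct:

```python
# Numerical check of Example 33 (assumes NumPy; not part of the NCERT solution).
import numpy as np

A = np.array([[1, -1,  2],
              [0,  2, -3],
              [3, -2,  4]], dtype=float)
B = np.array([[-2, 0,  1],
              [ 9, 2, -3],
              [ 6, 1, -2]], dtype=float)

print(A @ B)                    # identity matrix, so B = A^{-1}
D = np.array([1.0, 1.0, 2.0])   # right-hand side of the system
print(B @ D)                    # [0. 5. 3.]  ->  x = 0, y = 5, z = 3
```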
Example 34: Prove that
$∆ = \begin{vmatrix} a+bx&c+dx&p+qx\\ax+b&cx+d&px+q\\u&v&w \end{vmatrix}$ = $(1 - x^2) \begin{vmatrix} a&c&p\\b&d&q\\u&v&w \end{vmatrix}$
Answer:
Given:
The determinant $∆ = \begin{vmatrix} a+bx&c+dx&p+qx\\ax+b&cx+d&px+q\\u&v&w \end{vmatrix}$.
To Prove:
$∆ = (1 - x^2) \begin{vmatrix} a&c&p\\b&d&q\\u&v&w \end{vmatrix}$.
Proof:
We start with the given determinant:
$∆ = \begin{vmatrix} a+bx&c+dx&p+qx\\ax+b&cx+d&px+q\\u&v&w \end{vmatrix}$
Apply the row operation $R_1 \to R_1 - xR_2$.
The elements of the new $R_1$ will be:
$(a+bx) - x(ax+b) = a+bx-ax^2-bx = a - ax^2 = a(1-x^2)$
$(c+dx) - x(cx+d) = c+dx-cx^2-dx = c - cx^2 = c(1-x^2)$
$(p+qx) - x(px+q) = p+qx-px^2-qx = p - px^2 = p(1-x^2)$
So the determinant becomes:
$∆ = \begin{vmatrix} a(1-x^2)&c(1-x^2)&p(1-x^2)\\ax+b&cx+d&px+q\\u&v&w \end{vmatrix}$
Take the common factor $(1-x^2)$ out from the first row ($R_1$).
$∆ = (1 - x^2) \begin{vmatrix} a&c&p\\ax+b&cx+d&px+q\\u&v&w \end{vmatrix}$
Now, apply the row operation $R_2 \to R_2 - xR_1$ on the determinant inside the bracket.
The elements of the new $R_2$ will be:
$(ax+b) - x(a) = ax+b-ax = b$
$(cx+d) - x(c) = cx+d-cx = d$
$(px+q) - x(p) = px+q-px = q$
So the determinant inside the bracket becomes:
$\begin{vmatrix} a&c&p\\b&d&q\\u&v&w \end{vmatrix}$
Substituting this back, we get:
$∆ = (1 - x^2) \begin{vmatrix} a&c&p\\b&d&q\\u&v&w \end{vmatrix}$
Hence Proved.
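The identity can also be confirmed symbolically (a SymPy sketch, offered only as a cross-check of the row operations above):

```python
# Symbolic check of Example 34 (assumes SymPy; offered only as a cross-check).
import sympy as sp

a, b, c, d, p, q, u, v, w, x = sp.symbols('a b c d p q u v w x')
lhs = sp.Matrix([
    [a + b*x, c + d*x, p + q*x],
    [a*x + b, c*x + d, p*x + q],
    [u,       v,       w],
]).det()
rhs = (1 - x**2) * sp.Matrix([[a, c, p], [b, d, q], [u, v, w]]).det()
assert sp.simplify(lhs - rhs) == 0
```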
Miscellaneous Exercises on Chapter 4
Question 1. Prove that the determinant $\begin{vmatrix} x&\sinθ&\cosθ\\−\sinθ&−x&1\\\cosθ&1&x \end{vmatrix}$ is independent of θ.
Answer:
Let the given determinant be denoted by $D$.
Given: The determinant $D = \begin{vmatrix} x&\sinθ&\cosθ\\−\sinθ&−x&1\\\cosθ&1&x \end{vmatrix}$.
To Prove: The determinant $D$ is independent of $\theta$.
Solution:
We will expand the determinant along the first row ($R_1$).
$D = x \begin{vmatrix} -x & 1 \\ 1 & x \end{vmatrix} - \sin\theta \begin{vmatrix} -\sin\theta & 1 \\ \cos\theta & x \end{vmatrix} + \cos\theta \begin{vmatrix} -\sin\theta & -x \\ \cos\theta & 1 \end{vmatrix}$
Now, we evaluate the $2 \times 2$ determinants:
$D = x((-x)(x) - (1)(1)) - \sin\theta((-\sin\theta)(x) - (1)(\cos\theta)) + \cos\theta((-\sin\theta)(1) - (-x)(\cos\theta))$
Simplify the expression:
$D = x(-x^2 - 1) - \sin\theta(-x\sin\theta - \cos\theta) + \cos\theta(-\sin\theta + x\cos\theta)$
Distribute the terms:
$D = -x^3 - x + (- \sin\theta)(-x\sin\theta) + (-\sin\theta)(-\cos\theta) + (\cos\theta)(-\sin\theta) + (\cos\theta)(x\cos\theta)$
$D = -x^3 - x + x\sin^2\theta + \sin\theta\cos\theta - \sin\theta\cos\theta + x\cos^2\theta$
Cancel the $\sin\theta\cos\theta$ and $-\sin\theta\cos\theta$ terms:
$D = -x^3 - x + x\sin^2\theta + x\cos^2\theta$
Factor out $x$ from the terms involving $\theta$:
$D = -x^3 - x + x(\sin^2\theta + \cos^2\theta)$
Using the fundamental trigonometric identity $\sin^2\theta + \cos^2\theta = 1$:
$D = -x^3 - x + x(1)$
$D = -x^3 - x + x$
Simplify further:
$D = -x^3$
The value of the determinant is $-x^3$, which does not contain the variable $\theta$.
Therefore, the determinant is independent of $\theta$.
Hence, Proved.
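A brief numerical check, assuming NumPy is available: evaluating the determinant for several values of $\theta$ always returns $-x^3$.

```python
# Numerical check of Question 1 (assumes NumPy): the value is -x**3 for every theta.
import numpy as np

x = 1.7
for theta in np.linspace(0.0, 2*np.pi, 7):
    s, c = np.sin(theta), np.cos(theta)
    M = np.array([[ x,  s, c],
                  [-s, -x, 1],
                  [ c,  1, x]])
    print(round(np.linalg.det(M), 6))   # always -x**3, i.e. -4.913
```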
Question 2. Without expanding the determinant, prove that $\begin{vmatrix} a&a^2&bc\\b&b^2&ca\\c&c^2&ab \end{vmatrix} = \begin{vmatrix} 1&a^2&a^3\\1&b^2&b^3\\1&c^2&c^3 \end{vmatrix}$ .
Answer:
Given:
Two determinants: $D_1 = \begin{vmatrix} a&a^2&bc\\b&b^2&ca\\c&c^2&ab \end{vmatrix}$ and $D_2 = \begin{vmatrix} 1&a^2&a^3\\1&b^2&b^3\\1&c^2&c^3 \end{vmatrix}$.
To Prove:
$D_1 = D_2$ without expanding the determinants.
Proof:
Consider the Left Hand Side determinant $D_1 = \begin{vmatrix} a&a^2&bc\\b&b^2&ca\\c&c^2&ab \end{vmatrix}$.
Multiply $R_1$ by $a$, $R_2$ by $b$, and $R_3$ by $c$. When a row of a determinant is multiplied by a scalar, the value of the determinant is multiplied by the same scalar. Thus, the new determinant is $abc$ times the original determinant $D_1$.
$abc \cdot D_1 = \begin{vmatrix} a \cdot a & a \cdot a^2 & a \cdot bc \\ b \cdot b & b \cdot b^2 & b \cdot ca \\ c \cdot c & c \cdot c^2 & c \cdot ab \end{vmatrix} = \begin{vmatrix} a^2 & a^3 & abc \\ b^2 & b^3 & abc \\ c^2 & c^3 & abc \end{vmatrix}$.
Now, observe the third column ($C_3$) of the resulting determinant $\begin{vmatrix} a^2 & a^3 & abc \\ b^2 & b^3 & abc \\ c^2 & c^3 & abc \end{vmatrix}$. The element $abc$ is common to all entries in this column.
We can take $abc$ common from $C_3$. When a common factor is taken out from a column (or row), the determinant is divided by that factor.
So, $\begin{vmatrix} a^2 & a^3 & abc \\ b^2 & b^3 & abc \\ c^2 & c^3 & abc \end{vmatrix} = abc \begin{vmatrix} a^2 & a^3 & 1 \\ b^2 & b^3 & 1 \\ c^2 & c^3 & 1 \end{vmatrix}$.
Combining the steps, we have $abc \cdot D_1 = abc \cdot \begin{vmatrix} a^2 & a^3 & 1 \\ b^2 & b^3 & 1 \\ c^2 & c^3 & 1 \end{vmatrix}$.
If $abc \neq 0$, we can divide both sides by $abc$, giving $D_1 = \begin{vmatrix} a^2 & a^3 & 1 \\ b^2 & b^3 & 1 \\ c^2 & c^3 & 1 \end{vmatrix}$. If $abc = 0$, the equality holds as shown by direct expansion (if $a=0$, both determinants equal $b^2c^2(c-b)$ etc.).
Thus, $D_1 = \begin{vmatrix} a^2 & a^3 & 1 \\ b^2 & b^3 & 1 \\ c^2 & c^3 & 1 \end{vmatrix}$.
Let's call the resulting determinant $D' = \begin{vmatrix} a^2 & a^3 & 1 \\ b^2 & b^3 & 1 \\ c^2 & c^3 & 1 \end{vmatrix}$.
We need to transform $D'$ into $D_2 = \begin{vmatrix} 1&a^2&a^3\\1&b^2&b^3\\1&c^2&c^3 \end{vmatrix}$ using column operations.
The columns in $D'$ are $C_1 = \begin{pmatrix} a^2 \\ b^2 \\ c^2 \end{pmatrix}$, $C_2 = \begin{pmatrix} a^3 \\ b^3 \\ c^3 \end{pmatrix}$, $C_3 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$.
The columns in $D_2$ are $C'_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$, $C'_2 = \begin{pmatrix} a^2 \\ b^2 \\ c^2 \end{pmatrix}$, $C'_3 = \begin{pmatrix} a^3 \\ b^3 \\ c^3 \end{pmatrix}$.
We need to rearrange the columns of $D'$ from the order $(C_1, C_2, C_3)$ to $(C_3, C_1, C_2)$.
Perform the column swap $C_1 \leftrightarrow C_3$ on $D'$. Swapping two columns of a determinant multiplies its value by $-1$.
$D' = - \begin{vmatrix} 1 & a^3 & a^2 \\ 1 & b^3 & b^2 \\ 1 & c^3 & c^2 \end{vmatrix}$.
Now, perform the column swap $C_2 \leftrightarrow C_3$ on the new determinant $\begin{vmatrix} 1 & a^3 & a^2 \\ 1 & b^3 & b^2 \\ 1 & c^3 & c^2 \end{vmatrix}$. This again multiplies the value by $-1$.
$- \begin{vmatrix} 1 & a^3 & a^2 \\ 1 & b^3 & b^2 \\ 1 & c^3 & c^2 \end{vmatrix} = - (-1) \begin{vmatrix} 1 & a^2 & a^3 \\ 1 & b^2 & b^3 \\ 1 & c^2 & c^3 \end{vmatrix} = \begin{vmatrix} 1 & a^2 & a^3 \\ 1 & b^2 & b^3 \\ 1 & c^2 & c^3 \end{vmatrix}$.
The resulting determinant is $\begin{vmatrix} 1&a^2&a^3\\1&b^2&b^3\\1&c^2&c^3 \end{vmatrix}$, which is exactly the Right Hand Side determinant $D_2$.
Thus, we have shown that $D_1 = D' = D_2$.
Therefore, $\begin{vmatrix} a&a^2&bc\\b&b^2&ca\\c&c^2&ab \end{vmatrix} = \begin{vmatrix} 1&a^2&a^3\\1&b^2&b^3\\1&c^2&c^3 \end{vmatrix}$.
Hence, Proved.
Question 3. Evaluate $\begin{vmatrix} \cosα \cosβ &\cosα \sinβ &−\sinα\\−\sinβ &\cosβ &0\\\sinα \cosβ &\sinα \sinβ &\cosα \end{vmatrix}$
Answer:
Let the given determinant be $D$.
$D = \begin{vmatrix} \cosα \cosβ & \cosα \sinβ & −\sinα\\−\sinβ & \cosβ & 0\\\sinα \cosβ & \sinα \sinβ & \cosα \end{vmatrix}$
Solution:
We will evaluate the determinant by expanding along the second row ($R_2$), as it contains a zero element, simplifying the calculation.
The expansion along $R_2$ is given by:
$D = (-1)^{2+1}(-\sin\beta) \begin{vmatrix} \cos\alpha \sin\beta & -\sin\alpha \\ \sin\alpha \sin\beta & \cos\alpha \end{vmatrix} + (-1)^{2+2}(\cos\beta) \begin{vmatrix} \cos\alpha \cos\beta & -\sin\alpha \\ \sin\alpha \cosβ & \cos\alpha \end{vmatrix} + (-1)^{2+3}(0) \begin{vmatrix} \cos\alpha \cosβ & \cos\alpha \sinβ \\ \sin\alpha \cosβ & \sin\alpha \sinβ \end{vmatrix}$
$D = \sin\beta \left( (\cos\alpha \sin\beta)(\cos\alpha) - (-\sin\alpha)(\sin\alpha \sin\beta) \right) + \cos\beta \left( (\cos\alpha \cos\beta)(\cos\alpha) - (-\sin\alpha)(\sin\alpha \cos\beta) \right) + 0$
$D = \sin\beta \left( \cos^2\alpha \sin\beta + \sin^2\alpha \sin\beta \right) + \cos\beta \left( \cos^2\alpha \cos\beta + \sin^2\alpha \cos\beta \right)$
Factor out the common terms from the parentheses:
$D = \sin\beta \left( \sin\beta (\cos^2\alpha + \sin^2\alpha) \right) + \cos\beta \left( \cos\beta (\cos^2\alpha + \sin^2\alpha) \right)$
Using the fundamental trigonometric identity $\sin^2\alpha + \cos^2\alpha = 1$:
$D = \sin\beta \left( \sin\beta (1) \right) + \cos\beta \left( \cos\beta (1) \right)$
$D = \sin^2\beta + \cos^2\beta$
Using the fundamental trigonometric identity $\sin^2\beta + \cos^2\beta = 1$:
$D = 1$
The value of the determinant is $\mathbf{1}$.
Question 4. If a, b and c are real numbers, and
$∆ = \begin{vmatrix} b+c&c+a&a+b\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix} = 0$ ,
Show that either a + b + c = 0 or a = b = c.
Answer:
Given:
The determinant $\Delta = \begin{vmatrix} b+c&c+a&a+b\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix} = 0$, where $a, b, c$ are real numbers.
To Show:
Either $a+b+c=0$ or $a=b=c$.
Solution:
Consider the given determinant:
$\Delta = \begin{vmatrix} b+c&c+a&a+b\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix}$
Apply the row operation $R_1 \to R_1 + R_2 + R_3$:
$\Delta = \begin{vmatrix} (b+c)+(c+a)+(a+b)&(c+a)+(a+b)+(b+c)&(a+b)+(b+c)+(c+a)\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix}$
$\Delta = \begin{vmatrix} 2(a+b+c)&2(a+b+c)&2(a+b+c)\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix}$
Take out the common factor $2(a+b+c)$ from $R_1$:
$\Delta = 2(a+b+c) \begin{vmatrix} 1&1&1\\c+a&a+b&b+c\\a+b&b+c&c+a \end{vmatrix}$
Apply the column operations $C_2 \to C_2 - C_1$ and $C_3 \to C_3 - C_1$:
$\Delta = 2(a+b+c) \begin{vmatrix} 1&1-1&1-1\\c+a&(a+b)-(c+a)&(b+c)-(c+a)\\a+b&(b+c)-(a+b)&(c+a)-(a+b) \end{vmatrix}$
$\Delta = 2(a+b+c) \begin{vmatrix} 1&0&0\\c+a&b-c&b-a\\a+b&c-a&c-b \end{vmatrix}$
Expand the determinant along the first row ($R_1$):
$\Delta = 2(a+b+c) \left[ 1 \cdot \begin{vmatrix} b-c&b-a\\c-a&c-b \end{vmatrix} - 0 + 0 \right]$
$\Delta = 2(a+b+c) [ (b-c)(c-b) - (b-a)(c-a) ]$
$\Delta = 2(a+b+c) [ -(b-c)^2 - (bc - ab - ac + a^2) ]$
$\Delta = 2(a+b+c) [ -(b^2 - 2bc + c^2) - bc + ab + ac - a^2 ]$
$\Delta = 2(a+b+c) [ -b^2 + 2bc - c^2 - bc + ab + ac - a^2 ]$
$\Delta = 2(a+b+c) [ -a^2 - b^2 - c^2 + ab + bc + ac ]$
$\Delta = -2(a+b+c) [ a^2 + b^2 + c^2 - ab - bc - ac ]$
We are given that $\Delta = 0$. So,
$-2(a+b+c) [ a^2 + b^2 + c^2 - ab - bc - ac ] = 0$
Since $-2 \neq 0$, this equation holds if and only if either $(a+b+c) = 0$ or $(a^2 + b^2 + c^2 - ab - bc - ac) = 0$.
Case 1: $a+b+c = 0$.
This is one of the conditions we need to show.
Case 2: $a^2 + b^2 + c^2 - ab - bc - ac = 0$.
Multiply the equation by 2:
$2(a^2 + b^2 + c^2 - ab - bc - ac) = 0$
$2a^2 + 2b^2 + 2c^2 - 2ab - 2bc - 2ac = 0$
Rearrange the terms to form perfect squares:
$(a^2 - 2ab + b^2) + (b^2 - 2bc + c^2) + (c^2 - 2ac + a^2) = 0$
$(a-b)^2 + (b-c)^2 + (c-a)^2 = 0$
Since $a, b, c$ are real numbers, the squares of the differences $(a-b), (b-c), (c-a)$ are non-negative.
$(a-b)^2 \ge 0$, $(b-c)^2 \ge 0$, and $(c-a)^2 \ge 0$.
The sum of non-negative terms can be zero only if each individual term is zero.
So, $(a-b)^2 = 0$, $(b-c)^2 = 0$, and $(c-a)^2 = 0$.
This implies $a-b = 0$, $b-c = 0$, and $c-a = 0$.
From these equations, we get $a=b$, $b=c$, and $c=a$. Therefore, $a=b=c$.
Thus, if $\Delta = 0$, it implies that either $a+b+c = 0$ or $a=b=c$.
Hence, Shown.
Question 5. Solve the equation $\begin{vmatrix} x+a&x&x\\x&x+a&x\\x&x&x+a \end{vmatrix} = 0, \;a ≠ 0$
Answer:
Let the given determinant be $D$.
$D = \begin{vmatrix} x+a&x&x\\x&x+a&x\\x&x&x+a \end{vmatrix}$
We are given that $D = 0$ and $a \neq 0$.
Solution:
Apply the operation $R_1 \to R_1 + R_2 + R_3$ to the determinant:
$D = \begin{vmatrix} (x+a)+x+x&x+(x+a)+x&x+x+(x+a)\\x&x+a&x\\x&x&x+a \end{vmatrix}$
$D = \begin{vmatrix} 3x+a&3x+a&3x+a\\x&x+a&x\\x&x&x+a \end{vmatrix}$
Take out the common factor $(3x+a)$ from $R_1$:
$D = (3x+a) \begin{vmatrix} 1&1&1\\x&x+a&x\\x&x&x+a \end{vmatrix}$
Apply the operations $C_2 \to C_2 - C_1$ and $C_3 \to C_3 - C_1$:
$D = (3x+a) \begin{vmatrix} 1&1-1&1-1\\x&(x+a)-x&x-x\\x&x-x&(x+a)-x \end{vmatrix}$
$D = (3x+a) \begin{vmatrix} 1&0&0\\x&a&0\\x&0&a \end{vmatrix}$
Expand the determinant along the first row ($R_1$):
$D = (3x+a) \left[ 1 \cdot \begin{vmatrix} a&0\\0&a \end{vmatrix} - 0 \cdot \begin{vmatrix} x&0\\x&a \end{vmatrix} + 0 \cdot \begin{vmatrix} x&a\\x&0 \end{vmatrix} \right]$
$D = (3x+a) [ (a)(a) - (0)(0) ]$
$D = (3x+a)(a^2)$
We are given that $D = 0$.
So, $(3x+a)(a^2) = 0$.
Since $a \neq 0$, we have $a^2 \neq 0$.
For the product $(3x+a)(a^2)$ to be zero, the factor $(3x+a)$ must be zero.
$3x+a = 0$
$3x = -a$
$x = -\frac{a}{3}$
The solution to the equation is $\mathbf{x = -\frac{a}{3}}$.
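As a cross-check (assuming SymPy is available), the same equation can be solved symbolically; the only solution in $x$ is $-\frac{a}{3}$:

```python
# Symbolic cross-check of Question 5 (assumes SymPy).
import sympy as sp

x, a = sp.symbols('x a')
D = sp.Matrix([[x + a, x,     x],
               [x,     x + a, x],
               [x,     x,     x + a]]).det()
print(sp.factor(D))               # a**2*(a + 3*x)
print(sp.solve(sp.Eq(D, 0), x))   # [-a/3]
```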
Question 6. Prove that $\begin{vmatrix} a^2&bc&ac+c^2\\a^2+ab&b^2&ac\\ab&b^2+bc&c^2 \end{vmatrix} = 4a^2b^2c^2$
Answer:
To Prove:
$\begin{vmatrix} a^2&bc&ac+c^2\\a^2+ab&b^2&ac\\ab&b^2+bc&c^2 \end{vmatrix} = 4a^2b^2c^2$
Proof:
Consider the Left Hand Side (LHS) of the equation:
$LHS = \begin{vmatrix} a^2&bc&ac+c^2\\a^2+ab&b^2&ac\\ab&b^2+bc&c^2 \end{vmatrix}$
Take out the common factor $a$ from $C_1$, $b$ from $C_2$, and $c$ from $C_3$.
$LHS = abc \begin{vmatrix} a&c&a+c\\a+b&b&a\\b&b+c&c \end{vmatrix}$
Apply the column operation $C_1 \to C_1 + C_2 + C_3$:
$LHS = abc \begin{vmatrix} a+c+(a+c)&c&a+c\\(a+b)+b+a&b&a\\b+(b+c)+c&b+c&c \end{vmatrix}$
$LHS = abc \begin{vmatrix} 2a+2c&c&a+c\\2a+2b&b&a\\2b+2c&b+c&c \end{vmatrix}$
Take out the common factor $2$ from $C_1$:
$LHS = 2abc \begin{vmatrix} a+c&c&a+c\\a+b&b&a\\b+c&b+c&c \end{vmatrix}$
Apply the column operation $C_1 \to C_1 - C_3$:
$LHS = 2abc \begin{vmatrix} (a+c)-(a+c)&c&a+c\\(a+b)-a&b&a\\(b+c)-c&b+c&c \end{vmatrix}$
$LHS = 2abc \begin{vmatrix} 0&c&a+c\\b&b&a\\b&b+c&c \end{vmatrix}$
Expand the determinant along the first column ($C_1$). Note that the first element is 0.
$LHS = 2abc \left[ 0 \cdot \begin{vmatrix} b&a\\b+c&c \end{vmatrix} - b \cdot \begin{vmatrix} c&a+c\\b+c&c \end{vmatrix} + b \cdot \begin{vmatrix} c&a+c\\b&a \end{vmatrix} \right]$
$LHS = 2abc \left[ -b \left( c(c) - (a+c)(b+c) \right) + b \left( c(a) - (a+c)(b) \right) \right]$
$LHS = 2abc \left[ -b \left( c^2 - (ab + ac + bc + c^2) \right) + b \left( ac - (ab + bc) \right) \right]$
$LHS = 2abc \left[ -b \left( c^2 - ab - ac - bc - c^2 \right) + b \left( ac - ab - bc \right) \right]$
$LHS = 2abc \left[ -b \left( - ab - ac - bc \right) + b \left( ac - ab - bc \right) \right]$
Distribute $-b$ in the first term and $b$ in the second term inside the square brackets:
$LHS = 2abc \left[ (ab^2 + abc + b^2c) + (abc - ab^2 - b^2c) \right]$
Combine like terms inside the square brackets:
$LHS = 2abc \left[ ab^2 - ab^2 + abc + abc + b^2c - b^2c \right]$
$LHS = 2abc \left[ 2abc \right]$
$LHS = 4a^2b^2c^2$
This is equal to the Right Hand Side (RHS).
Therefore, $\begin{vmatrix} a^2&bc&ac+c^2\\a^2+ab&b^2&ac\\ab&b^2+bc&c^2 \end{vmatrix} = 4a^2b^2c^2$.
Hence, Proved.
Question 7. If $A^{-1} = \begin{bmatrix} 3&−1&1\\−15&6&−5\\5&−2&2 \end{bmatrix}$ and $B = \begin{bmatrix} 1&2&−2\\−1&3&0\\0&−2&1 \end{bmatrix}$, find $(AB)^{-1}$.
Answer:
Given:
$A^{-1} = \begin{pmatrix} 3 & -1 & 1 \\ -15 & 6 & -5 \\ 5 & -2 & 2 \end{pmatrix}$
$B = \begin{pmatrix} 1 & 2 & -2 \\ -1 & 3 & 0 \\ 0 & -2 & 1 \end{pmatrix}$
To Find:
$(AB)^{-1}$
Solution:
We know that for two invertible matrices A and B, the inverse of their product is given by the formula: $(AB)^{-1} = B^{-1}A^{-1}$.
We are given $A^{-1}$, so we need to find $B^{-1}$.
To find $B^{-1}$, we use the formula $B^{-1} = \frac{1}{|B|} \text{adj}(B)$.
First, calculate the determinant of B, $|B|$. We will expand along the first row ($R_1$):
$|B| = 1 \begin{vmatrix} 3 & 0 \\ -2 & 1 \end{vmatrix} - 2 \begin{vmatrix} -1 & 0 \\ 0 & 1 \end{vmatrix} + (-2) \begin{vmatrix} -1 & 3 \\ 0 & -2 \end{vmatrix}$
$|B| = 1((3)(1) - (0)(-2)) - 2((-1)(1) - (0)(0)) - 2((-1)(-2) - (3)(0))$
$|B| = 1(3 - 0) - 2(-1 - 0) - 2(2 - 0)$
$|B| = 3 - 2(-1) - 2(2)$
$|B| = 3 + 2 - 4$
$|B| = 1$
Since $|B| = 1 \neq 0$, $B$ is invertible, and $B^{-1}$ exists.
Next, find the adjoint of B, adj(B). First, find the cofactor matrix of B. Let $C_{ij}$ be the cofactor of the element in the $i$-th row and $j$-th column of B.
$C_{11} = + \begin{vmatrix} 3 & 0 \\ -2 & 1 \end{vmatrix} = (3)(1) - (0)(-2) = 3$
$C_{12} = - \begin{vmatrix} -1 & 0 \\ 0 & 1 \end{vmatrix} = -((-1)(1) - (0)(0)) = -(-1) = 1$
$C_{13} = + \begin{vmatrix} -1 & 3 \\ 0 & -2 \end{vmatrix} = (-1)(-2) - (3)(0) = 2 - 0 = 2$
$C_{21} = - \begin{vmatrix} 2 & -2 \\ -2 & 1 \end{vmatrix} = -((2)(1) - (-2)(-2)) = -(2 - 4) = -(-2) = 2$
$C_{22} = + \begin{vmatrix} 1 & -2 \\ 0 & 1 \end{vmatrix} = (1)(1) - (-2)(0) = 1 - 0 = 1$
$C_{23} = - \begin{vmatrix} 1 & 2 \\ 0 & -2 \end{vmatrix} = -((1)(-2) - (2)(0)) = -(-2 - 0) = -(-2) = 2$
$C_{31} = + \begin{vmatrix} 2 & -2 \\ 3 & 0 \end{vmatrix} = (2)(0) - (-2)(3) = 0 - (-6) = 6$
$C_{32} = - \begin{vmatrix} 1 & -2 \\ -1 & 0 \end{vmatrix} = -((1)(0) - (-2)(-1)) = -(0 - 2) = -(-2) = 2$
$C_{33} = + \begin{vmatrix} 1 & 2 \\ -1 & 3 \end{vmatrix} = (1)(3) - (2)(-1) = 3 - (-2) = 3 + 2 = 5$
The cofactor matrix of B is $\begin{pmatrix} 3 & 1 & 2 \\ 2 & 1 & 2 \\ 6 & 2 & 5 \end{pmatrix}$.
The adjoint of B is the transpose of the cofactor matrix:
$\text{adj}(B) = \begin{pmatrix} 3 & 1 & 2 \\ 2 & 1 & 2 \\ 6 & 2 & 5 \end{pmatrix}^T = \begin{pmatrix} 3 & 2 & 6 \\ 1 & 1 & 2 \\ 2 & 2 & 5 \end{pmatrix}$
Now, calculate $B^{-1}$:
$B^{-1} = \frac{1}{|B|} \text{adj}(B) = \frac{1}{1} \begin{pmatrix} 3 & 2 & 6 \\ 1 & 1 & 2 \\ 2 & 2 & 5 \end{pmatrix} = \begin{pmatrix} 3 & 2 & 6 \\ 1 & 1 & 2 \\ 2 & 2 & 5 \end{pmatrix}$
Finally, calculate $(AB)^{-1} = B^{-1}A^{-1}$:
$(AB)^{-1} = \begin{pmatrix} 3 & 2 & 6 \\ 1 & 1 & 2 \\ 2 & 2 & 5 \end{pmatrix} \begin{pmatrix} 3 & -1 & 1 \\ -15 & 6 & -5 \\ 5 & -2 & 2 \end{pmatrix}$
Perform matrix multiplication:
$(AB)^{-1} = \begin{pmatrix} (3)(3)+(2)(-15)+(6)(5) & (3)(-1)+(2)(6)+(6)(-2) & (3)(1)+(2)(-5)+(6)(2) \\ (1)(3)+(1)(-15)+(2)(5) & (1)(-1)+(1)(6)+(2)(-2) & (1)(1)+(1)(-5)+(2)(2) \\ (2)(3)+(2)(-15)+(5)(5) & (2)(-1)+(2)(6)+(5)(-2) & (2)(1)+(2)(-5)+(5)(2) \end{pmatrix}$
$(AB)^{-1} = \begin{pmatrix} 9-30+30 & -3+12-12 & 3-10+12 \\ 3-15+10 & -1+6-4 & 1-5+4 \\ 6-30+25 & -2+12-10 & 2-10+10 \end{pmatrix}$
$(AB)^{-1} = \begin{pmatrix} 9 & -3 & 5 \\ -2 & 1 & 0 \\ 1 & 0 & 2 \end{pmatrix}$
The value of $(AB)^{-1}$ is $\begin{pmatrix} 9 & -3 & 5 \\ -2 & 1 & 0 \\ 1 & 0 & 2 \end{pmatrix}$.
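A numerical cross-check with NumPy (a sketch, not the adjoint method used above): recover $A$ from the given $A^{-1}$, invert the product $AB$ directly, and compare it with $B^{-1}A^{-1}$.

```python
# Numerical cross-check of Question 7 (assumes NumPy; not the adjoint method used above).
import numpy as np

A_inv = np.array([[  3, -1,  1],
                  [-15,  6, -5],
                  [  5, -2,  2]], dtype=float)
B = np.array([[ 1,  2, -2],
              [-1,  3,  0],
              [ 0, -2,  1]], dtype=float)

A = np.linalg.inv(A_inv)                      # recover A from the given A^{-1}
print(np.round(np.linalg.inv(A @ B), 6))      # direct inverse of the product
print(np.round(np.linalg.inv(B) @ A_inv, 6))  # B^{-1} A^{-1}; both match the answer above
```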
Question 8. Let $A = \begin{bmatrix} 1&2&1\\2&3&1\\1&1&5 \end{bmatrix}$. Verify that
(i) $[\text{adj } A]^{-1} = \text{adj}(A^{-1})$
(ii) $(A^{-1})^{-1} = A$
Answer:
Given:
The matrix $A = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 3 & 1 \\ 1 & 1 & 5 \end{pmatrix}$.
To Verify:
(i) $(\text{adj A})^{-1} = \text{adj}(A^{-1})$
(ii) $(A^{-1})^{-1} = A$
Verification of (ii): $(A^{-1})^{-1} = A$
First, we need to find the inverse of matrix A, $A^{-1}$. The formula for the inverse is $A^{-1} = \frac{1}{|A|} \text{adj}(A)$.
Calculate the determinant of A, $|A|$, by expanding along the first row ($R_1$):
$|A| = 1 \begin{vmatrix} 3 & 1 \\ 1 & 5 \end{vmatrix} - 2 \begin{vmatrix} 2 & 1 \\ 1 & 5 \end{vmatrix} + 1 \begin{vmatrix} 2 & 3 \\ 1 & 1 \end{vmatrix}$
$|A| = 1((3)(5) - (1)(1)) - 2((2)(5) - (1)(1)) + 1((2)(1) - (3)(1))$
$|A| = 1(15 - 1) - 2(10 - 1) + 1(2 - 3)$
$|A| = 14 - 18 - 1 = -5$
Since $|A| = -5 \neq 0$, the matrix A is invertible.
Next, find the adjoint of A, $\text{adj}(A)$, by finding the cofactor matrix and taking its transpose.
Cofactors of A:
$C_{11} = +\begin{vmatrix} 3&1\\1&5 \end{vmatrix} = 14$
$C_{12} = -\begin{vmatrix} 2&1\\1&5 \end{vmatrix} = -9$
$C_{13} = +\begin{vmatrix} 2&3\\1&1 \end{vmatrix} = -1$
$C_{21} = -\begin{vmatrix} 2&1\\1&5 \end{vmatrix} = -9$
$C_{22} = +\begin{vmatrix} 1&1\\1&5 \end{vmatrix} = 4$
$C_{23} = -\begin{vmatrix} 1&2\\1&1 \end{vmatrix} = 1$
$C_{31} = +\begin{vmatrix} 2&1\\3&1 \end{vmatrix} = -1$
$C_{32} = -\begin{vmatrix} 1&1\\2&1 \end{vmatrix} = 1$
$C_{33} = +\begin{vmatrix} 1&2\\2&3 \end{vmatrix} = -1$
The cofactor matrix of A is $\begin{pmatrix} 14 & -9 & -1 \\ -9 & 4 & 1 \\ -1 & 1 & -1 \end{pmatrix}$.
$\text{adj}(A) = \begin{pmatrix} 14 & -9 & -1 \\ -9 & 4 & 1 \\ -1 & 1 & -1 \end{pmatrix}^T = \begin{pmatrix} 14 & -9 & -1 \\ -9 & 4 & 1 \\ -1 & 1 & -1 \end{pmatrix}$.
Now, calculate $A^{-1}$:
$A^{-1} = \frac{1}{|A|} \text{adj}(A) = \frac{1}{-5} \begin{pmatrix} 14 & -9 & -1 \\ -9 & 4 & 1 \\ -1 & 1 & -1 \end{pmatrix} = \begin{pmatrix} -\frac{14}{5} & \frac{9}{5} & \frac{1}{5} \\ \frac{9}{5} & -\frac{4}{5} & -\frac{1}{5} \\ \frac{1}{5} & -\frac{1}{5} & \frac{1}{5} \end{pmatrix}$.
The property $(A^{-1})^{-1} = A$ states that the inverse of the inverse of a matrix is the matrix itself.
To verify it here, recall that $A A^{-1} = A^{-1} A = I$, which can be checked directly by multiplying $A$ and the $A^{-1}$ obtained above. By the definition of an inverse, these equations say precisely that $A$ is the inverse of $A^{-1}$; hence $(A^{-1})^{-1} = A$.
We have $A = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 3 & 1 \\ 1 & 1 & 5 \end{pmatrix}$.
Thus, $(A^{-1})^{-1} = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 3 & 1 \\ 1 & 1 & 5 \end{pmatrix}$, which is indeed equal to A.
Property (ii) is Verified.
Verification of (i): $(\text{adj A})^{-1} = \text{adj}(A^{-1})$
We need to calculate the LHS, $(\text{adj A})^{-1}$, and the RHS, $\text{adj}(A^{-1})$, and check if they are equal.
Calculating LHS: $(\text{adj A})^{-1}$
We already found $\text{adj}(A) = \begin{pmatrix} 14 & -9 & -1 \\ -9 & 4 & 1 \\ -1 & 1 & -1 \end{pmatrix}$.
To find the inverse of adj(A), we can use the formula $(\text{adj A})^{-1} = \frac{1}{|\text{adj A}|} \text{adj}(\text{adj A})$.
We know that $|\text{adj A}| = |A|^{n-1}$. For a $3 \times 3$ matrix ($n=3$), $|\text{adj A}| = |A|^{3-1} = |A|^2$.
$|\text{adj A}| = (-5)^2 = 25$.
We also know that $\text{adj}(\text{adj A}) = |A|^{n-2} A$. For a $3 \times 3$ matrix ($n=3$), $\text{adj}(\text{adj A}) = |A|^{3-2} A = |A| A$.
So, $(\text{adj A})^{-1} = \frac{1}{|\text{adj A}|} \text{adj}(\text{adj A}) = \frac{1}{|A|^2} (|A| A) = \frac{|A|}{|A|^2} A = \frac{1}{|A|} A$.
Substitute the values of $|A|$ and A:
$(\text{adj A})^{-1} = \frac{1}{-5} \begin{pmatrix} 1 & 2 & 1 \\ 2 & 3 & 1 \\ 1 & 1 & 5 \end{pmatrix} = \begin{pmatrix} -\frac{1}{5} & -\frac{2}{5} & -\frac{1}{5} \\ -\frac{2}{5} & -\frac{3}{5} & -\frac{1}{5} \\ -\frac{1}{5} & -\frac{1}{5} & -\frac{5}{5} \end{pmatrix} = \begin{pmatrix} -\frac{1}{5} & -\frac{2}{5} & -\frac{1}{5} \\ -\frac{2}{5} & -\frac{3}{5} & -\frac{1}{5} \\ -\frac{1}{5} & -\frac{1}{5} & -1 \end{pmatrix}$.
This is the LHS.
Calculating RHS: $\text{adj}(A^{-1})$
We need to find the adjoint of $A^{-1}$. We already found $A^{-1} = \begin{pmatrix} -\frac{14}{5} & \frac{9}{5} & \frac{1}{5} \\ \frac{9}{5} & -\frac{4}{5} & -\frac{1}{5} \\ \frac{1}{5} & -\frac{1}{5} & \frac{1}{5} \end{pmatrix}$.
We find the cofactor matrix of $A^{-1}$ and then take its transpose.
Cofactors of $A^{-1}$:
$C'_{11} = +\begin{vmatrix} -4/5 & -1/5 \\ -1/5 & 1/5 \end{vmatrix} = (-\frac{4}{25}) - (\frac{1}{25}) = -\frac{5}{25} = -\frac{1}{5}$
$C'_{12} = -\begin{vmatrix} 9/5 & -1/5 \\ 1/5 & 1/5 \end{vmatrix} = -(\frac{9}{25} - (-\frac{1}{25})) = -(\frac{9}{25} + \frac{1}{25}) = -\frac{10}{25} = -\frac{2}{5}$
$C'_{13} = +\begin{vmatrix} 9/5 & -4/5 \\ 1/5 & -1/5 \end{vmatrix} = (-\frac{9}{25}) - (-\frac{4}{25}) = -\frac{9}{25} + \frac{4}{25} = -\frac{5}{25} = -\frac{1}{5}$
$C'_{21} = -\begin{vmatrix} 9/5 & 1/5 \\ -1/5 & 1/5 \end{vmatrix} = -(\frac{9}{25} - (-\frac{1}{25})) = -(\frac{9}{25} + \frac{1}{25}) = -\frac{10}{25} = -\frac{2}{5}$
$C'_{22} = +\begin{vmatrix} -14/5 & 1/5 \\ 1/5 & 1/5 \end{vmatrix} = (-\frac{14}{25}) - (\frac{1}{25}) = -\frac{15}{25} = -\frac{3}{5}$
$C'_{23} = -\begin{vmatrix} -14/5 & 9/5 \\ 1/5 & -1/5 \end{vmatrix} = -(\frac{14}{25} - \frac{9}{25}) = -(\frac{5}{25}) = -\frac{1}{5}$
$C'_{31} = +\begin{vmatrix} 9/5 & 1/5 \\ -4/5 & -1/5 \end{vmatrix} = (-\frac{9}{25}) - (-\frac{4}{25}) = -\frac{9}{25} + \frac{4}{25} = -\frac{5}{25} = -\frac{1}{5}$
$C'_{32} = -\begin{vmatrix} -14/5 & 1/5 \\ 9/5 & -1/5 \end{vmatrix} = -(\frac{14}{25} - \frac{9}{25}) = -(\frac{5}{25}) = -\frac{1}{5}$
$C'_{33} = +\begin{vmatrix} -14/5 & 9/5 \\ 9/5 & -4/5 \end{vmatrix} = (\frac{56}{25}) - (\frac{81}{25}) = -\frac{25}{25} = -1$
The cofactor matrix of $A^{-1}$ is $\begin{pmatrix} -1/5 & -2/5 & -1/5 \\ -2/5 & -3/5 & -1/5 \\ -1/5 & -1/5 & -1 \end{pmatrix}$.
$\text{adj}(A^{-1}) = \begin{pmatrix} -1/5 & -2/5 & -1/5 \\ -2/5 & -3/5 & -1/5 \\ -1/5 & -1/5 & -1 \end{pmatrix}^T = \begin{pmatrix} -1/5 & -2/5 & -1/5 \\ -2/5 & -3/5 & -1/5 \\ -1/5 & -1/5 & -1 \end{pmatrix}$.
This is the RHS.
Comparing the LHS and RHS:
$(\text{adj A})^{-1} = \begin{pmatrix} -\frac{1}{5} & -\frac{2}{5} & -\frac{1}{5} \\ -\frac{2}{5} & -\frac{3}{5} & -\frac{1}{5} \\ -\frac{1}{5} & -\frac{1}{5} & -1 \end{pmatrix}$ and $\text{adj}(A^{-1}) = \begin{pmatrix} -\frac{1}{5} & -\frac{2}{5} & -\frac{1}{5} \\ -\frac{2}{5} & -\frac{3}{5} & -\frac{1}{5} \\ -\frac{1}{5} & -\frac{1}{5} & -1 \end{pmatrix}$.
LHS = RHS.
Property (i) is Verified.
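Both identities can also be spot-checked numerically, assuming NumPy is available and using the fact that $\text{adj}(M) = \det(M)\,M^{-1}$ for an invertible matrix $M$:

```python
# Numerical spot check of Question 8 (assumes NumPy).
import numpy as np

def adj(M):
    # For an invertible matrix, adj(M) = det(M) * M^{-1}.
    return np.linalg.det(M) * np.linalg.inv(M)

A = np.array([[1, 2, 1],
              [2, 3, 1],
              [1, 1, 5]], dtype=float)
A_inv = np.linalg.inv(A)

print(np.allclose(np.linalg.inv(adj(A)), adj(A_inv)))   # (i)  True
print(np.allclose(np.linalg.inv(A_inv), A))             # (ii) True
```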
Question 9. Evaluate $\begin{vmatrix} x&y&x+y\\y&x+y&x\\x+y&x&y \end{vmatrix}$
Answer:
Let the given determinant be $D$.
$D = \begin{vmatrix} x&y&x+y\\y&x+y&x\\x+y&x&y \end{vmatrix}$
Solution:
Apply the row operation $R_1 \to R_1 + R_2 + R_3$:
$D = \begin{vmatrix} x+y+(x+y)&y+(x+y)+x&(x+y)+x+y\\y&x+y&x\\x+y&x&y \end{vmatrix}$
$D = \begin{vmatrix} 2x+2y&2x+2y&2x+2y\\y&x+y&x\\x+y&x&y \end{vmatrix}$
Take out the common factor $(2x+2y)$ from $R_1$.
$D = (2x+2y) \begin{vmatrix} 1&1&1\\y&x+y&x\\x+y&x&y \end{vmatrix}$
$D = 2(x+y) \begin{vmatrix} 1&1&1\\y&x+y&x\\x+y&x&y \end{vmatrix}$
Apply the column operations $C_2 \to C_2 - C_1$ and $C_3 \to C_3 - C_1$:
$D = 2(x+y) \begin{vmatrix} 1&1-1&1-1\\y&(x+y)-y&x-y\\x+y&x-(x+y)&y-(x+y) \end{vmatrix}$
$D = 2(x+y) \begin{vmatrix} 1&0&0\\y&x&x-y\\x+y&-y&-x \end{vmatrix}$
Expand the determinant along the first row ($R_1$):
$D = 2(x+y) \left[ 1 \cdot \begin{vmatrix} x&x-y\\-y&-x \end{vmatrix} - 0 + 0 \right]$
$D = 2(x+y) [ (x)(-x) - (x-y)(-y) ]$
$D = 2(x+y) [ -x^2 - (-xy + y^2) ]$
$D = 2(x+y) [ -x^2 + xy - y^2 ]$
Rearrange the terms inside the bracket:
$D = 2(x+y) [ -(x^2 - xy + y^2) ]$
$D = -2(x+y)(x^2 - xy + y^2)$
Recall the sum of cubes formula: $a^3 + b^3 = (a+b)(a^2 - ab + b^2)$.
The expression inside the bracket $x^2 - xy + y^2$ is part of this formula.
So, $(x+y)(x^2 - xy + y^2) = x^3 + y^3$.
$D = -2(x^3 + y^3)$
The value of the determinant is $\mathbf{-2(x^3 + y^3)}$.
Question 10. Evaluate $\begin{vmatrix} 1&x&y\\1&x+y&y\\1&x&x+y \end{vmatrix}$
Answer:
Let the given determinant be $D$.
$D = \begin{vmatrix} 1&x&y\\1&x+y&y\\1&x&x+y \end{vmatrix}$
Solution:
We can simplify the determinant by applying row operations to create zeros in the first column.
Apply the operation $R_2 \to R_2 - R_1$:
$D = \begin{vmatrix} 1&x&y\\1-1&(x+y)-x&y-y\\1&x&x+y \end{vmatrix}$
$D = \begin{vmatrix} 1&x&y\\0&y&0\\1&x&x+y \end{vmatrix}$
Apply the operation $R_3 \to R_3 - R_1$:
$D = \begin{vmatrix} 1&x&y\\0&y&0\\1-1&x-x&(x+y)-y \end{vmatrix}$
$D = \begin{vmatrix} 1&x&y\\0&y&0\\0&0&x \end{vmatrix}$
Now, the determinant is in upper triangular form. The value of an upper triangular determinant is the product of its diagonal elements.
Alternatively, we can expand along the first column ($C_1$) since it has two zeros:
$D = 1 \cdot \begin{vmatrix} y&0\\0&x \end{vmatrix} - 0 \cdot \begin{vmatrix} x&y\\0&x \end{vmatrix} + 0 \cdot \begin{vmatrix} x&y\\y&0 \end{vmatrix}$
$D = 1 \cdot ((y)(x) - (0)(0))$
$D = yx - 0$
$D = xy$
The value of the determinant is $\mathbf{xy}$.
Using properties of determinants in Exercises 11 to 15, prove that:
Question 11. $\begin{vmatrix} α&α^2&β+γ\\β&β^2&γ+α\\γ&γ^2&α+β \end{vmatrix}$ = (β – γ) (γ – α) (α – β) (α + β + γ)
Answer:
To Prove:
$\begin{vmatrix} α&α^2&β+γ\\β&β^2&γ+α\\γ&γ^2&α+β \end{vmatrix} = (\beta – \gamma) (\gamma – \alpha) (\alpha – \beta) (\alpha + \beta + \gamma)$
Proof:
Consider the Left Hand Side (LHS) determinant:
$LHS = \begin{vmatrix} α&α^2&β+γ\\β&β^2&γ+α\\γ&γ^2&α+β \end{vmatrix}$
Apply the column operation $C_3 \to C_3 + C_1$:
$LHS = \begin{vmatrix} α&α^2&β+γ+α\\β&β^2&γ+α+β\\γ&γ^2&α+β+γ \end{vmatrix}$
$LHS = \begin{vmatrix} α&α^2&α+β+γ\\β&β^2&α+β+γ\\γ&γ^2&α+β+γ \end{vmatrix}$
Take out the common factor $(\alpha+\beta+\gamma)$ from $C_3$:
$LHS = (\alpha+\beta+\gamma) \begin{vmatrix} α&α^2&1\\β&β^2&1\\γ&γ^2&1 \end{vmatrix}$
Apply the row operation $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$:
$LHS = (\alpha+\beta+\gamma) \begin{vmatrix} α&α^2&1\\β-α&β^2-α^2&1-1\\γ-α&γ^2-α^2&1-1 \end{vmatrix}$
$LHS = (\alpha+\beta+\gamma) \begin{vmatrix} α&α^2&1\\β-α&(β-α)(β+α)&0\\γ-α&(γ-α)(γ+α)&0 \end{vmatrix}$
Expand the determinant along the third column ($C_3$), as it has two zeros:
$LHS = (\alpha+\beta+\gamma) \left[ 1 \cdot \begin{vmatrix} β-α&(β-α)(β+α)\\γ-α&(γ-α)(γ+α) \end{vmatrix} - 0 + 0 \right]$
$LHS = (\alpha+\beta+\gamma) \begin{vmatrix} β-α&(β-α)(β+α)\\γ-α&(γ-α)(γ+α) \end{vmatrix}$
Take out the common factor $(\beta-\alpha)$ from $R_1$ and $(\gamma-\alpha)$ from $R_2$ of the $2 \times 2$ determinant:
$LHS = (\alpha+\beta+\gamma) (\beta-\alpha) (\gamma-\alpha) \begin{vmatrix} 1&β+α\\1&γ+α \end{vmatrix}$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} 1&β+α\\1&γ+α \end{vmatrix} = 1(γ+α) - 1(β+α) = γ+α - β-α = γ - β$
Substitute this value back into the expression for LHS:
$LHS = (\alpha+\beta+\gamma) (\beta-\alpha) (\gamma-\alpha) (γ - β)$
We need to rearrange the terms to match the RHS $(\beta – \gamma) (γ – α) (α – β) (α + β + γ)$.
Notice that $(\beta-\alpha) = -(\alpha-\beta)$ and $(\gamma-\beta) = -(\beta-\gamma)$.
$LHS = (\alpha+\beta+\gamma) (-(\alpha-\beta)) (\gamma-\alpha) (-(\beta-\gamma))$
$LHS = (-1)(-1) (\alpha+\beta+\gamma) (\alpha-\beta) (\gamma-\alpha) (\beta-\gamma)$
$LHS = 1 \cdot (\alpha+\beta+\gamma) (\alpha-\beta) (\gamma-\alpha) (\beta-\gamma)$
$LHS = (\beta-\gamma) (\gamma-\alpha) (\alpha-\beta) (\alpha+\beta+\gamma)$
This is equal to the Right Hand Side (RHS).
Therefore, $\begin{vmatrix} α&α^2&β+γ\\β&β^2&γ+α\\γ&γ^2&α+β \end{vmatrix} = (\beta – γ) (γ – α) (α – β) (α + β + γ)$.
Hence, Proved.
Question 12. $\begin{vmatrix} x&x^2&1+px^3\\y&y^2&1+py^3\\z&z^2&1+pz^3 \end{vmatrix}$ = (1 + pxyz) (x – y) (y – z) (z – x), where p is any scalar.
Answer:
Let the given determinant be $D$. We are asked to prove that $D = (1 + pxyz) (x – y) (y – z) (z – x)$.
$D = \begin{vmatrix} x&x^2&1+px^3\\y&y^2&1+py^3\\z&z^2&1+pz^3 \end{vmatrix}$
Using the property that if elements of any column (or row) of a determinant are expressed as the sum of two or more terms, then the determinant can be expressed as the sum of two or more determinants. We split the determinant based on the third column ($C_3$):
$D = \begin{vmatrix} x&x^2&1\\y&y^2&1\\z&z^2&1 \end{vmatrix} + \begin{vmatrix} x&x^2&px^3\\y&y^2&py^3\\z&z^2&pz^3 \end{vmatrix}$
Let $D_1 = \begin{vmatrix} x&x^2&1\\y&y^2&1\\z&z^2&1 \end{vmatrix}$ and $D_2 = \begin{vmatrix} x&x^2&px^3\\y&y^2&py^3\\z&z^2&pz^3 \end{vmatrix}$.
Consider $D_1$:
$D_1 = \begin{vmatrix} x&x^2&1\\y&y^2&1\\z&z^2&1 \end{vmatrix}$
This is a form of the Vandermonde determinant. By swapping columns $C_1 \leftrightarrow C_3$ and then $C_2 \leftrightarrow C_3$, we get the standard form, noting that each swap introduces a factor of $-1$:
$D_1 = (-1) \begin{vmatrix} 1&x^2&x\\1&y^2&y\\1&z^2&z \end{vmatrix} = (-1)(-1) \begin{vmatrix} 1&x&x^2\\1&y&y^2\\1&z&z^2 \end{vmatrix} = \begin{vmatrix} 1&x&x^2\\1&y&y^2\\1&z&z^2 \end{vmatrix}$
The value of the standard Vandermonde determinant $\begin{vmatrix} 1&a&a^2\\1&b&b^2\\1&c&c^2 \end{vmatrix}$ is $(b-a)(c-a)(c-b)$.
Applying this formula to $D_1$:
$D_1 = (y-x)(z-x)(z-y)$
We can rewrite this in terms of $(x-y)$, $(y-z)$, and $(z-x)$:
$D_1 = (-(x-y)) (z-x) (-(y-z)) = (x-y)(y-z)(z-x)$
Now consider $D_2$:
$D_2 = \begin{vmatrix} x&x^2&px^3\\y&y^2&py^3\\z&z^2&pz^3 \end{vmatrix}$
Factor out $p$ from the third column ($C_3$):
$D_2 = p \begin{vmatrix} x&x^2&x^3\\y&y^2&y^3\\z&z^2&z^3 \end{vmatrix}$
Now, factor out $x$ from the first row ($R_1$), $y$ from the second row ($R_2$), and $z$ from the third row ($R_3$):
$D_2 = pxyz \begin{vmatrix} 1&x&x^2\\1&y&y^2\\1&z&z^2 \end{vmatrix}$
The determinant $\begin{vmatrix} 1&x&x^2\\1&y&y^2\\1&z&z^2 \end{vmatrix}$ is the standard Vandermonde determinant, which equals $(y-x)(z-x)(z-y)$.
So, $D_2 = pxyz (y-x)(z-x)(z-y)$.
Rewriting the factors:
$D_2 = pxyz (-(x-y)) (z-x) (-(y-z)) = pxyz (x-y)(y-z)(z-x)$
The original determinant $D$ is the sum of $D_1$ and $D_2$:
$D = D_1 + D_2$
$D = (x-y)(y-z)(z-x) + pxyz (x-y)(y-z)(z-x)$
Factor out the common term $(x-y)(y-z)(z-x)$:
$D = \left(1 + pxyz\right) (x-y)(y-z)(z-x)$
This is the required expression.
Hence, it is proved that $\begin{vmatrix} x&x^2&1+px^3\\y&y^2&1+py^3\\z&z^2&1+pz^3 \end{vmatrix}$ = (1 + pxyz) (x – y) (y – z) (z – x).
Question 13. $\begin{vmatrix} 3a&−a+b&−a+c\\−b+a&3b&−b+c\\−c+a&−c+b&3c \end{vmatrix} = 3(a + b + c) (ab + bc + ca)$
Answer:
Given:
The determinant $\begin{vmatrix} 3a&−a+b&−a+c\\−b+a&3b&−b+c\\−c+a&−c+b&3c \end{vmatrix}$
To Prove:
$\begin{vmatrix} 3a&−a+b&−a+c\\−b+a&3b&−b+c\\−c+a&−c+b&3c \end{vmatrix} = 3(a + b + c) (ab + bc + ca)$
Proof:
Let the given determinant be $D$.
$D = \begin{vmatrix} 3a&−a+b&−a+c\\−b+a&3b&−b+c\\−c+a&−c+b&3c \end{vmatrix}$
Apply the column operation $C_1 \to C_1 + C_2 + C_3$:
The elements of the new first column are:
First element: $3a + (-a+b) + (-a+c) = 3a - a + b - a + c = a+b+c$
Second element: $(-b+a) + 3b + (-b+c) = -b + a + 3b - b + c = a+b+c$
Third element: $(-c+a) + (-c+b) + 3c = -c + a - c + b + 3c = a+b+c$
So, the determinant becomes:
$D = \begin{vmatrix} a+b+c&−a+b&−a+c\\a+b+c&3b&−b+c\\a+b+c&−c+b&3c \end{vmatrix}$
Factor out $(a+b+c)$ from the first column ($C_1$):
$D = (a+b+c) \begin{vmatrix} 1&−a+b&−a+c\\1&3b&−b+c\\1&−c+b&3c \end{vmatrix}$
Apply the row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_1$ to create zeros in the first column:
For $R_2 \to R_2 - R_1$:
First element: $1 - 1 = 0$
Second element: $3b - (-a+b) = 3b + a - b = a+2b$
Third element: $(-b+c) - (-a+c) = -b+c+a-c = a-b$
For $R_3 \to R_3 - R_1$:
First element: $1 - 1 = 0$
Second element: $(-c+b) - (-a+b) = -c+b+a-b = a-c$
Third element: $3c - (-a+c) = 3c+a-c = a+2c$
The determinant is now:
$D = (a+b+c) \begin{vmatrix} 1&−a+b&−a+c\\0&a+2b&a-b\\0&a-c&a+2c \end{vmatrix}$
Expand the determinant along the first column ($C_1$):
$D = (a+b+c) \times 1 \times \begin{vmatrix} a+2b&a-b\\a-c&a+2c \end{vmatrix} - 0 + 0$
Evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} a+2b&a-b\\a-c&a+2c \end{vmatrix} = (a+2b)(a+2c) - (a-b)(a-c)$
$= (a^2 + 2ac + 2ab + 4bc) - (a^2 - ac - ab + bc)$
$= a^2 + 2ac + 2ab + 4bc - a^2 + ac + ab - bc$
$= (a^2 - a^2) + (2ac + ac) + (2ab + ab) + (4bc - bc)$
$= 0 + 3ac + 3ab + 3bc$
$= 3ab + 3bc + 3ca$
$= 3(ab + bc + ca)$
Substitute this result back into the expression for $D$:
$D = (a+b+c) \times 3(ab + bc + ca)$
$D = 3(a+b+c)(ab + bc + ca)$
This matches the right-hand side of the given equation.
Hence, it is proved that $\begin{vmatrix} 3a&−a+b&−a+c\\−b+a&3b&−b+c\\−c+a&−c+b&3c \end{vmatrix} = 3(a + b + c) (ab + bc + ca)$.
Question 14. $\begin{vmatrix} 1 &1+p&1+p+q\\2&3+2p&4+3p+2q\\3&6+3p&10+6p+3q \end{vmatrix} = 1$
Answer:
Given:
The determinant $\begin{vmatrix} 1 &1+p&1+p+q\\2&3+2p&4+3p+2q\\3&6+3p&10+6p+3q \end{vmatrix}$
To Prove:
$\begin{vmatrix} 1 &1+p&1+p+q\\2&3+2p&4+3p+2q\\3&6+3p&10+6p+3q \end{vmatrix} = 1$
Proof:
Let the given determinant be $D$.
$D = \begin{vmatrix} 1 &1+p&1+p+q\\2&3+2p&4+3p+2q\\3&6+3p&10+6p+3q \end{vmatrix}$
Apply the row operation $R_2 \to R_2 - 2R_1$. The new elements in the second row will be:
$2 - 2(1) = 0$
$(3+2p) - 2(1+p) = 3+2p - 2 - 2p = 1$
$(4+3p+2q) - 2(1+p+q) = 4+3p+2q - 2 - 2p - 2q = 2+p$
So the determinant becomes:
$D = \begin{vmatrix} 1 &1+p&1+p+q\\0&1&2+p\\3&6+3p&10+6p+3q \end{vmatrix}$
Now, apply the row operation $R_3 \to R_3 - 3R_1$. The new elements in the third row will be:
$3 - 3(1) = 0$
$(6+3p) - 3(1+p) = 6+3p - 3 - 3p = 3$
$(10+6p+3q) - 3(1+p+q) = 10+6p+3q - 3 - 3p - 3q = 7+3p$
The determinant is now:
$D = \begin{vmatrix} 1 &1+p&1+p+q\\0&1&2+p\\0&3&7+3p \end{vmatrix}$
Expand the determinant along the first column ($C_1$), as it contains two zeros. The expansion is $1 \times (\text{minor of element at } R_1, C_1) - 0 \times (\text{minor}) + 0 \times (\text{minor})$.
$D = 1 \times \begin{vmatrix} 1&2+p\\3&7+3p \end{vmatrix}$
Now, evaluate the $2 \times 2$ determinant:
$\begin{vmatrix} 1&2+p\\3&7+3p \end{vmatrix} = (1)(7+3p) - (3)(2+p)$
$= (7+3p) - (6+3p)$
$= 7+3p - 6 - 3p$
$= (7-6) + (3p-3p)$
$= 1 + 0$
$= 1$
So, the value of the determinant $D$ is $1$.
$D = 1$
This matches the right-hand side of the equation we were asked to prove.
Hence, it is proved that $\begin{vmatrix} 1 &1+p&1+p+q\\2&3+2p&4+3p+2q\\3&6+3p&10+6p+3q \end{vmatrix} = 1$.
Question 15. $\begin{vmatrix} \sinα&\cosα&\cos(α+δ)\\\sinβ&\cosβ&\cos(β+δ)\\\sinγ&\cosγ&\cos(γ+δ) \end{vmatrix} = 0$
Answer:
Given:
The determinant $\begin{vmatrix} \sinα&\cosα&\cos(α+δ)\\\sinβ&\cosβ&\cos(β+δ)\\\sinγ&\cosγ&\cos(γ+δ) \end{vmatrix}$
To Prove:
$\begin{vmatrix} \sinα&\cosα&\cos(α+δ)\\\sinβ&\cosβ&\cos(β+δ)\\\sinγ&\cosγ&\cos(γ+δ) \end{vmatrix} = 0$
Proof:
Let the given determinant be $D$.
$D = \begin{vmatrix} \sinα&\cosα&\cos(α+δ)\\\sinβ&\cosβ&\cos(β+δ)\\\sinγ&\cosγ&\cos(γ+δ) \end{vmatrix}$
Using the trigonometric identity $\cos(A+B) = \cos A \cos B - \sin A \sin B$, we expand the elements in the third column ($C_3$).
The elements are:
$\cos(α+δ) = \cosα \cosδ - \sinα \sinδ$
$\cos(β+δ) = \cosβ \cosδ - \sinβ \sinδ$
$\cos(γ+δ) = \cosγ \cosδ - \sinγ \sinδ$
So the determinant becomes:
$D = \begin{vmatrix} \sinα&\cosα&\cosα \cosδ - \sinα \sinδ\\\sinβ&\cosβ&\cosβ \cosδ - \sinβ \sinδ\\\sinγ&\cosγ&\cosγ \cosδ - \sinγ \sinδ \end{vmatrix}$
Using the property that if elements of any column (or row) of a determinant are expressed as the sum of two or more terms, then the determinant can be expressed as the sum of two or more determinants. We split the determinant based on the third column ($C_3$):
$D = \begin{vmatrix} \sinα&\cosα&\cosα \cosδ\\\sinβ&\cosβ&\cosβ \cosδ\\\sinγ&\cosγ&\cosγ \cosδ \end{vmatrix} + \begin{vmatrix} \sinα&\cosα&- \sinα \sinδ\\\sinβ&\cosβ&- \sinβ \sinδ\\\sinγ&\cosγ&- \sinγ \sinδ \end{vmatrix}$
Note the minus sign in the second determinant which comes from the expansion of $\cos(A+B)$.
Let's consider the first determinant: $\begin{vmatrix} \sinα&\cosα&\cosα \cosδ\\\sinβ&\cosβ&\cosβ \cosδ\\\sinγ&\cosγ&\cosγ \cosδ \end{vmatrix}$.
Factor out $\cosδ$ from the third column ($C_3$):
$\cosδ \begin{vmatrix} \sinα&\cosα&\cosα\\\sinβ&\cosβ&\cosβ\\\sinγ&\cosγ&\cosγ \end{vmatrix}$
In this determinant, the second column ($C_2$) and the third column ($C_3$) are identical. If any two columns (or rows) of a determinant are identical, the value of the determinant is zero.
So, $\begin{vmatrix} \sinα&\cosα&\cosα\\\sinβ&\cosβ&\cosβ\\\sinγ&\cosγ&\cosγ \end{vmatrix} = 0$.
Thus, the first part of the sum is $\cosδ \times 0 = 0$.
Now, let's consider the second determinant: $\begin{vmatrix} \sinα&\cosα&- \sinα \sinδ\\\sinβ&\cosβ&- \sinβ \sinδ\\\sinγ&\cosγ&- \sinγ \sinδ \end{vmatrix}$.
Factor out $- \sinδ$ from the third column ($C_3$):
$- \sinδ \begin{vmatrix} \sinα&\cosα&\sinα\\\sinβ&\cosβ&\sinβ\\\sinγ&\cosγ&\sinγ \end{vmatrix}$
In this determinant, the first column ($C_1$) and the third column ($C_3$) are identical. Therefore, its value is zero.
So, $\begin{vmatrix} \sinα&\cosα&\sinα\\\sinβ&\cosβ&\sinβ\\\sinγ&\cosγ&\sinγ \end{vmatrix} = 0$.
Thus, the second part of the sum is $- \sinδ \times 0 = 0$.
Adding the values of the two determinants:
$D = 0 + 0 = 0$
Thus, we have shown that the value of the determinant is $0$.
Hence, it is proved that $\begin{vmatrix} \sinα&\cosα&\cos(α+δ)\\\sinβ&\cosβ&\cos(β+δ)\\\sinγ&\cosγ&\cos(γ+δ) \end{vmatrix} = 0$.
Question 16. Solve the system of equations
$\frac{2}{x}$ + $\frac{3}{y}$ + $\frac{10}{z}$ = 4
$\frac{4}{x}$ - $\frac{6}{y}$ + $\frac{5}{z}$ = 1
$\frac{6}{x}$ + $\frac{9}{y}$ - $\frac{20}{z}$ = 2
Answer:
Given:
The system of equations:
$\frac{2}{x}$ + $\frac{3}{y}$ + $\frac{10}{z}$ = 4
$\frac{4}{x}$ - $\frac{6}{y}$ + $\frac{5}{z}$ = 1
$\frac{6}{x}$ + $\frac{9}{y}$ - $\frac{20}{z}$ = 2
To Find:
The values of $x$, $y$, and $z$ that satisfy the given system of equations.
Solution:
The given equations involve the reciprocals of $x$, $y$, and $z$. Let's make a substitution to convert this into a linear system.
Let $u = \frac{1}{x}$, $v = \frac{1}{y}$, and $w = \frac{1}{z}$.
Substituting these into the given equations, we get a linear system in terms of $u$, $v$, and $w$:
$2u + 3v + 10w = 4$
$4u - 6v + 5w = 1$
$6u + 9v - 20w = 2$
We can solve this linear system using Cramer's rule. First, we write the coefficient matrix $A$ and the constant vector $B$:
$A = \begin{pmatrix} 2 & 3 & 10 \\ 4 & -6 & 5 \\ 6 & 9 & -20 \end{pmatrix}$
$B = \begin{pmatrix} 4 \\ 1 \\ 2 \end{pmatrix}$
Calculate the determinant of the coefficient matrix, $D = \det(A)$:
$D = \begin{vmatrix} 2 & 3 & 10 \\ 4 & -6 & 5 \\ 6 & 9 & -20 \end{vmatrix}$
Expand along the first row:
$D = 2 \begin{vmatrix} -6 & 5 \\ 9 & -20 \end{vmatrix} - 3 \begin{vmatrix} 4 & 5 \\ 6 & -20 \end{vmatrix} + 10 \begin{vmatrix} 4 & -6 \\ 6 & 9 \end{vmatrix}$
$D = 2((-6)(-20) - (5)(9)) - 3((4)(-20) - (5)(6)) + 10((4)(9) - (-6)(6))$
$D = 2(120 - 45) - 3(-80 - 30) + 10(36 + 36)$
$D = 2(75) - 3(-110) + 10(72)$
$D = 150 + 330 + 720$
$D = 1200$
Since $D \neq 0$, the system has a unique solution.
Calculate $D_u$ by replacing the first column of $A$ with the constant vector $B$:
$D_u = \begin{vmatrix} 4 & 3 & 10 \\ 1 & -6 & 5 \\ 2 & 9 & -20 \end{vmatrix}$
Expand along the first row:
$D_u = 4 \begin{vmatrix} -6 & 5 \\ 9 & -20 \end{vmatrix} - 3 \begin{vmatrix} 1 & 5 \\ 2 & -20 \end{vmatrix} + 10 \begin{vmatrix} 1 & -6 \\ 2 & 9 \end{vmatrix}$
$D_u = 4(120 - 45) - 3(-20 - 10) + 10(9 + 12)$
$D_u = 4(75) - 3(-30) + 10(21)$
$D_u = 300 + 90 + 210$
$D_u = 600$
Calculate $D_v$ by replacing the second column of $A$ with the constant vector $B$:
$D_v = \begin{vmatrix} 2 & 4 & 10 \\ 4 & 1 & 5 \\ 6 & 2 & -20 \end{vmatrix}$
Expand along the first row:
$D_v = 2 \begin{vmatrix} 1 & 5 \\ 2 & -20 \end{vmatrix} - 4 \begin{vmatrix} 4 & 5 \\ 6 & -20 \end{vmatrix} + 10 \begin{vmatrix} 4 & 1 \\ 6 & 2 \end{vmatrix}$
$D_v = 2(-20 - 10) - 4(-80 - 30) + 10(8 - 6)$
$D_v = 2(-30) - 4(-110) + 10(2)$
$D_v = -60 + 440 + 20$
$D_v = 400$
Calculate $D_w$ by replacing the third column of $A$ with the constant vector $B$:
$D_w = \begin{vmatrix} 2 & 3 & 4 \\ 4 & -6 & 1 \\ 6 & 9 & 2 \end{vmatrix}$
Expand along the first row:
$D_w = 2 \begin{vmatrix} -6 & 1 \\ 9 & 2 \end{vmatrix} - 3 \begin{vmatrix} 4 & 1 \\ 6 & 2 \end{vmatrix} + 4 \begin{vmatrix} 4 & -6 \\ 6 & 9 \end{vmatrix}$
$D_w = 2(-12 - 9) - 3(8 - 6) + 4(36 + 36)$
$D_w = 2(-21) - 3(2) + 4(72)$
$D_w = -42 - 6 + 288$
$D_w = 240$
Now, we find the values of $u$, $v$, and $w$ using Cramer's rule:
$u = \frac{D_u}{D} = \frac{600}{1200} = \frac{1}{2}$
$v = \frac{D_v}{D} = \frac{400}{1200} = \frac{1}{3}$
$w = \frac{D_w}{D} = \frac{240}{1200} = \frac{1}{5}$
Finally, substitute back to find $x$, $y$, and $z$ using the original substitutions $u = \frac{1}{x}$, $v = \frac{1}{y}$, and $w = \frac{1}{z}$:
$\frac{1}{x} = u \implies \frac{1}{x} = \frac{1}{2} \implies x = 2$
$\frac{1}{y} = v \implies \frac{1}{y} = \frac{1}{3} \implies y = 3$
$\frac{1}{z} = w \implies \frac{1}{z} = \frac{1}{5} \implies z = 5$
The solution to the system of equations is $x=2$, $y=3$, and $z=5$.
Verification:
Substitute the values into the original equations:
Eq 1: $\frac{2}{2} + \frac{3}{3} + \frac{10}{5} = 1 + 1 + 2 = 4$ (Holds)
Eq 2: $\frac{4}{2} - \frac{6}{3} + \frac{5}{5} = 2 - 2 + 1 = 1$ (Holds)
Eq 3: $\frac{6}{2} + \frac{9}{3} - \frac{20}{5} = 3 + 3 - 4 = 2$ (Holds)
The solution is verified.
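The Cramer's rule steps above translate directly into a small helper function; the sketch below (assuming NumPy is available; `cramer` is a hypothetical name, not a library routine) reproduces $u = \frac{1}{2}$, $v = \frac{1}{3}$, $w = \frac{1}{5}$ and hence $x=2$, $y=3$, $z=5$.

```python
# A small Cramer's rule helper mirroring the steps above (assumes NumPy; `cramer` is
# a hypothetical name, not a library function).
import numpy as np

def cramer(A, rhs):
    """Solve A x = rhs by Cramer's rule; A must be square with nonzero determinant."""
    D = np.linalg.det(A)
    x = np.empty(len(rhs))
    for i in range(len(rhs)):
        Ai = A.copy()
        Ai[:, i] = rhs            # replace the i-th column with the constants
        x[i] = np.linalg.det(Ai) / D
    return x

A = np.array([[2,  3,  10],
              [4, -6,   5],
              [6,  9, -20]], dtype=float)
b = np.array([4.0, 1.0, 2.0])

u, v, w = cramer(A, b)            # 0.5, 0.333..., 0.2
print(1/u, 1/v, 1/w)              # approximately 2, 3, 5
```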
Choose the correct answer in Exercises 17 to 19.
Question 17. If a, b, c are in A.P., then the determinant
$\begin{vmatrix} x+2&x+3&x+2a\\x+3&x+4&x+2b\\x+4&x+5&x+2c \end{vmatrix}$ is
(A) 0
(B) 1
(C) x
(D) 2x
Answer:
Given:
The determinant $\begin{vmatrix} x+2&x+3&x+2a\\x+3&x+4&x+2b\\x+4&x+5&x+2c \end{vmatrix}$.
The numbers $a$, $b$, and $c$ are in Arithmetic Progression (A.P.).
To Find:
The value of the given determinant.
Solution:
Let the given determinant be $D$.
$D = \begin{vmatrix} x+2&x+3&x+2a\\x+3&x+4&x+2b\\x+4&x+5&x+2c \end{vmatrix}$
Since $a$, $b$, and $c$ are in A.P., the difference between consecutive terms is constant. This means:
$b - a = c - b$
(Property of A.P.)
This equality implies $2b = a+c$. Also, from $b-a = c-b$, we have $2(b-a) = 2(c-b)$.
Apply the row operations $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - R_2$ to simplify the determinant.
For the new second row ($R_2'$):
- First element: $(x+3) - (x+2) = 1$
- Second element: $(x+4) - (x+3) = 1$
- Third element: $(x+2b) - (x+2a) = 2b - 2a = 2(b-a)$
For the new third row ($R_3'$):
- First element: $(x+4) - (x+3) = 1$
- Second element: $(x+5) - (x+4) = 1$
- Third element: $(x+2c) - (x+2b) = 2c - 2b = 2(c-b)$
The determinant becomes:
$D = \begin{vmatrix} x+2&x+3&x+2a\\1&1&2(b-a)\\1&1&2(c-b) \end{vmatrix}$
As $a, b, c$ are in A.P., we know that $b-a = c-b$. Let this common difference be $d$, so $b-a = d$ and $c-b = d$.
Substitute this into the determinant:
$D = \begin{vmatrix} x+2&x+3&x+2a\\1&1&2d\\1&1&2d \end{vmatrix}$
Observe that the second row ($R_2$) and the third row ($R_3$) of this determinant are identical.
A fundamental property of determinants states that if any two rows (or columns) of a determinant are identical, the value of the determinant is zero.
Therefore, $D = 0$.
The value of the determinant is $0$. This corresponds to option (A).
The final answer is (A) 0.
Question 18. If x, y, z are nonzero real numbers, then the inverse of matrix A = $\begin{bmatrix}x&0&0\\0&y&0\\0&0&z \end{bmatrix}$ is
(A) $\begin{bmatrix}x^{−1}&0&0\\0&y^{−1}&0\\0&0&z^{−1} \end{bmatrix}$
(B) $xyz \begin{bmatrix}x^{−1}&0&0\\0&y^{−1}&0\\0&0&z^{−1} \end{bmatrix}$
(C) $\frac{1}{xyz} \begin{bmatrix}x&0&0\\0&y&0\\0&0&z \end{bmatrix}$
(D) $\frac{1}{xyz} \begin{bmatrix}1&0&0\\0&1&0\\0&0&1 \end{bmatrix}$
Answer:
Given:
The matrix $A = \begin{bmatrix}x&0&0\\0&y&0\\0&0&z \end{bmatrix}$, where $x, y, z$ are nonzero real numbers.
To Find:
The inverse of matrix $A$, denoted as $A^{-1}$.
Solution:
The inverse of a square matrix $A$ exists if and only if its determinant is non-zero. The formula for the inverse is given by $A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$, where $\det(A)$ is the determinant of $A$ and $\text{adj}(A)$ is the adjoint of $A$.
First, calculate the determinant of matrix $A$:
$\det(A) = \begin{vmatrix} x&0&0\\0&y&0\\0&0&z \end{vmatrix}$
Expanding along the first row:
$\det(A) = x \begin{vmatrix} y&0\\0&z \end{vmatrix} - 0 \begin{vmatrix} 0&0\\0&z \end{vmatrix} + 0 \begin{vmatrix} 0&y\\0&0 \end{vmatrix}$
$\det(A) = x(yz - 0) - 0 + 0 = xyz$
Since $x, y, z$ are nonzero, $xyz \neq 0$, so $\det(A) \neq 0$. Thus, the inverse of matrix $A$ exists.
Next, calculate the adjoint of matrix $A$. The adjoint is the transpose of the cofactor matrix. The cofactor $C_{ij}$ is given by $C_{ij} = (-1)^{i+j} M_{ij}$, where $M_{ij}$ is the minor of the element at position $(i, j)$.
The cofactor matrix of $A$ is:
$C_{11} = (-1)^{1+1} \begin{vmatrix} y&0\\0&z \end{vmatrix} = yz$
$C_{12} = (-1)^{1+2} \begin{vmatrix} 0&0\\0&z \end{vmatrix} = 0$
$C_{13} = (-1)^{1+3} \begin{vmatrix} 0&y\\0&0 \end{vmatrix} = 0$
$C_{21} = (-1)^{2+1} \begin{vmatrix} 0&0\\0&z \end{vmatrix} = 0$
$C_{22} = (-1)^{2+2} \begin{vmatrix} x&0\\0&z \end{vmatrix} = xz$
$C_{23} = (-1)^{2+3} \begin{vmatrix} x&0\\0&0 \end{vmatrix} = 0$
$C_{31} = (-1)^{3+1} \begin{vmatrix} 0&0\\y&0 \end{vmatrix} = 0$
$C_{32} = (-1)^{3+2} \begin{vmatrix} x&0\\0&0 \end{vmatrix} = 0$
$C_{33} = (-1)^{3+3} \begin{vmatrix} x&0\\0&y \end{vmatrix} = xy$
The cofactor matrix is $C = \begin{bmatrix} yz&0&0\\0&xz&0\\0&0&xy \end{bmatrix}$.
The adjoint matrix is $\text{adj}(A) = C^T = \begin{bmatrix} yz&0&0\\0&xz&0\\0&0&xy \end{bmatrix}$.
Now, calculate the inverse $A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$:
$A^{-1} = \frac{1}{xyz} \begin{bmatrix} yz&0&0\\0&xz&0\\0&0&xy \end{bmatrix}$
Multiply each element by $\frac{1}{xyz}$:
$A^{-1} = \begin{bmatrix} \frac{yz}{xyz}&0&0\\0&\frac{xz}{xyz}&0\\0&0&\frac{xy}{xyz} \end{bmatrix} = \begin{bmatrix} \frac{1}{x}&0&0\\0&\frac{1}{y}&0\\0&0&\frac{1}{z} \end{bmatrix}$
Using the notation $x^{-1} = \frac{1}{x}$, $y^{-1} = \frac{1}{y}$, and $z^{-1} = \frac{1}{z}$, the inverse matrix is:
$A^{-1} = \begin{bmatrix} x^{-1}&0&0\\0&y^{-1}&0\\0&0&z^{-1} \end{bmatrix}$
Comparing this result with the given options, we see that it matches option (A).
The final answer is (A) $\begin{bmatrix}x^{−1}&0&0\\0&y^{−1}&0\\0&0&z^{−1} \end{bmatrix}$.
Question 19. Let A = $\begin{bmatrix}1&\sinθ&1\\−\sinθ&1&\sinθ\\−1&−\sinθ&1 \end{bmatrix}$ , where 0 ≤ θ ≤ 2π. Then
(A) Det(A) = 0
(B) Det(A) ∈ (2, ∞)
(C) Det(A) ∈ (2, 4)
(D) Det(A) ∈ [2, 4]
Answer:
Given:
The matrix $A = \begin{bmatrix}1&\sinθ&1\\−\sinθ&1&\sinθ\\−1&−\sinθ&1 \end{bmatrix}$.
The range of $\theta$ is $0 \leq θ \leq 2π$.
To Find:
The range of values for $\det(A)$.
Solution:
Let's calculate the determinant of matrix $A$. We expand the determinant along the first row ($R_1$).
$\det(A) = 1 \times \begin{vmatrix} 1&\sinθ\\−\sinθ&1 \end{vmatrix} - \sinθ \times \begin{vmatrix} −\sinθ&\sinθ\\−1&1 \end{vmatrix} + 1 \times \begin{vmatrix} −\sinθ&1\\−1&−\sinθ \end{vmatrix}$
Now, we evaluate the $2 \times 2$ determinants:
$\begin{vmatrix} 1&\sinθ\\−\sinθ&1 \end{vmatrix} = (1)(1) - (\sinθ)(-\sinθ) = 1 + \sin^2θ$
$\begin{vmatrix} −\sinθ&\sinθ\\−1&1 \end{vmatrix} = (-\sinθ)(1) - (\sinθ)(-1) = -\sinθ + \sinθ = 0$
$\begin{vmatrix} −\sinθ&1\\−1&−\sinθ \end{vmatrix} = (-\sinθ)(-\sinθ) - (1)(-1) = \sin^2θ + 1$
Substitute these values back into the expression for $\det(A)$:
$\det(A) = 1 \times (1 + \sin^2θ) - \sinθ \times (0) + 1 \times (\sin^2θ + 1)$
$\det(A) = (1 + \sin^2θ) - 0 + (\sin^2θ + 1)$
$\det(A) = 1 + \sin^2θ + \sin^2θ + 1$
$\det(A) = 2 + 2\sin^2θ$
We are given that $0 \leq θ \leq 2π$. For any real value of $\theta$, the value of $\sinθ$ is in the range $[-1, 1]$.
$-1 \leq \sinθ \leq 1$
Squaring the value of $\sinθ$, the range of $\sin^2θ$ is from $0$ (when $\sin\theta = 0$) to $1$ (when $\sin\theta = \pm 1$).
$0 \leq \sin^2θ \leq 1$
Now, we find the range of $2\sin^2θ$ by multiplying the inequality by 2:
$2 \times 0 \leq 2\sin^2θ \leq 2 \times 1$
$0 \leq 2\sin^2θ \leq 2$
Finally, we find the range of $\det(A) = 2 + 2\sin^2θ$ by adding 2 to all parts of the inequality:
$2 + 0 \leq 2 + 2\sin^2θ \leq 2 + 2$
$2 \leq \det(A) \leq 4$
So, the value of $\det(A)$ is in the closed interval $[2, 4]$.
Comparing this result with the given options:
- (A) $\det(A) = 0$ is incorrect.
- (B) $\det(A) ∈ (2, ∞)$ is incorrect, as the maximum value is 4.
- (C) $\det(A) ∈ (2, 4)$ is incorrect, as $\det(A)$ can be equal to 2 (when $\sinθ=0$) and 4 (when $\sin\theta=\pm 1$).
- (D) $\det(A) ∈ [2, 4]$ is correct, as the range includes both 2 and 4.
The final answer is (D) Det(A) ∈ [2, 4].
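A final numerical check, assuming NumPy is available: sampling $\theta$ over $[0, 2\pi]$ confirms $\det(A) = 2 + 2\sin^2\theta$ and that its values fill the interval $[2, 4]$.

```python
# Numerical confirmation of Question 19 (assumes NumPy): det(A) = 2 + 2*sin(theta)**2.
import numpy as np

thetas = np.linspace(0.0, 2*np.pi, 1001)
dets = []
for t in thetas:
    s = np.sin(t)
    A = np.array([[ 1,  s, 1],
                  [-s,  1, s],
                  [-1, -s, 1]])
    dets.append(np.linalg.det(A))

dets = np.array(dets)
print(np.allclose(dets, 2 + 2*np.sin(thetas)**2))   # True
print(dets.min(), dets.max())                       # approximately 2.0 and 4.0
```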